
Python Requests Integration not working.

Hi, I've spent two hours searching for answers here in the community, but I haven't found one. I already downloaded the Crawlera certificate to remove the warning message, but I'm still not getting any response from the URL. It should simply return my IP address. There's no error; the output is just blank.

import requests

px = {
    "http":"http://<MYAPIKEY>:@proxy.crawlera.com:8010",
    "https":"https://<MYAPIKEY>:@proxy.crawlera.com:8010"
}

url = "https://api.ipify.org/"

response = requests.request("GET", url, proxies=px, verify="crawlera-ca.crt")

print(response.text)


Sample execution: [screenshot showing the blank output]


I think when you define the https proxy for Crawlera, the proxy URL itself doesn't need to be https:

"https": "http://apikey_here:@proxy.crawlera.com:8010"