Python Requests Integration not working.

Posted almost 4 years ago by Rylie Galicia


Hi, I've spent two hours searching for answers here in the community, but I haven't found one. I already downloaded the Crawlera certificate to remove the warning message, but I'm still getting no response from the URL. It should simply return my IP address. There's no error; the output is just blank.

import requests

px = {
    "http":"http://<MYAPIKEY>:@proxy.crawlera.com:8010",
    "https":"https://<MYAPIKEY>:@proxy.crawlera.com:8010"
}

url = "https://api.ipify.org/"

response = requests.request("GET", url, proxies=px, verify="crawlera-ca.crt")

print(response.text)


Sample execution:

[screenshot: the script runs and prints a blank response]
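For what it's worth, one way to narrow down a blank response like this (an illustrative check, not part of the original post): requests does not raise on non-200 responses, so printing the status code and body length shows whether the proxy returned an error page or a genuinely empty body.

```python
import requests

def diagnose(url, proxies, verify):
    """Illustrative helper: show why a response body might be blank."""
    r = requests.get(url, proxies=proxies, verify=verify, timeout=30)
    print("status code:", r.status_code)   # e.g. 407 means proxy auth failed
    print("body length:", len(r.text))     # 0 confirms an empty body
    r.raise_for_status()                   # surface HTTP errors explicitly
    return r
```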



2 Comments


Attila Toth posted almost 4 years ago

I think that when you define the https proxy for Crawlera, the proxy URL itself shouldn't use the https scheme:

"https":"http://apikey_here:@proxy.crawlera.com:8010"
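Putting that change into the original script, a minimal sketch might look like this (the API key and certificate path are the placeholders from the post):

```python
import requests

API_KEY = "<MYAPIKEY>"  # placeholder, as in the original post

# Both entries use the http:// scheme: the proxy endpoint itself is
# plain HTTP, even when the target URL is https.
px = {
    "http": f"http://{API_KEY}:@proxy.crawlera.com:8010",
    "https": f"http://{API_KEY}:@proxy.crawlera.com:8010",
}

def fetch_my_ip():
    # verify= points at the downloaded Crawlera CA certificate
    return requests.get("https://api.ipify.org/", proxies=px,
                        verify="crawlera-ca.crt").text
```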



Vinay Mehendiratta posted almost 4 years ago

import requests

px = {
    "http": "http://<MYAPIKEY>:@proxy.crawlera.com:8010/",
    "https": "https://<MYAPIKEY>:@proxy.crawlera.com:8010/"
}

url = "https://api.ipify.org/"

response = requests.request("GET", url, proxies=px, verify="crawlera-ca.crt")

print(response.text)

