Using Python Requests + Crawlera, is there a way to force an IP rotation?

Posted about 4 years ago by engineeringsstm2

Unanswered

A site I crawl via Python's requests library and Crawlera recently made a change: instead of timing out or returning a non-200 error code, it now returns a valid page saying I've exceeded the quota for my IP address.


Is there a header I can pass to ensure my Crawlera request goes through a new IP?


# Sample code
import requests

url = "SOME_URL"
proxy_host = "proxy.crawlera.com"
proxy_port = "8010"
proxy_auth = "CRAWLERA_KEY"  # Crawlera API key, used as the proxy username with an empty password
proxies = {
    "http": "http://{}:@{}:{}/".format(proxy_auth, proxy_host, proxy_port),
    "https": "http://{}:@{}:{}/".format(proxy_auth, proxy_host, proxy_port),
}
response = requests.get(url, proxies=proxies, verify=False, timeout=20)
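
For reference, here is a minimal sketch of how the X-Crawlera-Session header from Crawlera's documentation could be used to control which outgoing IP a request goes through. It assumes the header behaves as documented (no session header means the request is routed through the IP pool normally, while a session pins one IP until it is discarded); the URL and key placeholders are carried over from the sample above.

# A minimal sketch, not a verified solution: it assumes that without a
# session header Crawlera picks an IP from the pool per request, and that
# X-Crawlera-Session pins an IP until the session is discarded.
import requests

url = "SOME_URL"
proxy_auth = "CRAWLERA_KEY"  # API key as username, empty password
proxies = {
    "http": "http://{}:@proxy.crawlera.com:8010/".format(proxy_auth),
    "https": "http://{}:@proxy.crawlera.com:8010/".format(proxy_auth),
}

# Ask Crawlera to create a sticky session; the assigned session ID comes
# back in the X-Crawlera-Session response header.
first = requests.get(url, proxies=proxies, verify=False, timeout=20,
                     headers={"X-Crawlera-Session": "create"})
session_id = first.headers.get("X-Crawlera-Session")

# Reuse the same IP while the site still serves normal pages.
same_ip = requests.get(url, proxies=proxies, verify=False, timeout=20,
                       headers={"X-Crawlera-Session": session_id})

# Once the quota page shows up, send the request without the session
# header (or create a new session) so it is routed through a different
# IP from the pool.
fresh_ip = requests.get(url, proxies=proxies, verify=False, timeout=20)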


0 Votes


1 Comment


Frank Mustermann posted over 3 years ago

I have the same question. I need a new IP for every crawl because I'm crawling the same website each time, and it shows a different page if the IP has already been used in the last 24 hours. How can I accomplish that?

0 Votes
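
For the case in the comment above (a fresh IP for every crawl run), one possible approach is to create a Crawlera session at the start of the run and delete it at the end, so the next run is assigned a different IP. This is only a sketch and assumes the sessions endpoint (DELETE /sessions/<id> on proxy.crawlera.com:8010, authenticated with the API key) behaves as described in the Crawlera documentation; SOME_URL and CRAWLERA_KEY are placeholders.

# A sketch only: assumes Crawlera's sessions endpoint
# (DELETE /sessions/<id> on proxy.crawlera.com:8010, basic auth with the
# API key) works as in the Crawlera documentation.
import requests

proxy_auth = "CRAWLERA_KEY"
proxies = {
    "http": "http://{}:@proxy.crawlera.com:8010/".format(proxy_auth),
    "https": "http://{}:@proxy.crawlera.com:8010/".format(proxy_auth),
}

# Start of the crawl run: create a session so every request in this run
# goes through the same IP.
start = requests.get("SOME_URL", proxies=proxies, verify=False, timeout=20,
                     headers={"X-Crawlera-Session": "create"})
session_id = start.headers.get("X-Crawlera-Session")

# ... crawl the site with headers={"X-Crawlera-Session": session_id} ...

# End of the crawl run: delete the session so the next run is assigned a
# different IP from the pool.
if session_id:
    requests.delete(
        "http://proxy.crawlera.com:8010/sessions/{}".format(session_id),
        auth=(proxy_auth, ""),
    )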
