
Using Python Requests + Crawlera, is there a way to force IP rotation?

A site I crawl via Python's requests library and Crawlera recently made a change: instead of timing out or returning a non-200 error code, it now returns a valid page saying I've exceeded the quota for my IP address.


Is there a header I can pass to ensure my Crawlera request goes through a new IP?
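
The only session-related header I have found in the Crawlera docs is X-Crawlera-Session. My guess is something like the snippet below, but I don't know whether creating a new session actually guarantees a different outgoing IP (CRAWLERA_KEY and SOME_URL are placeholders):

import requests

url = "SOME_URL"
proxies = {"http": "http://CRAWLERA_KEY:@proxy.crawlera.com:8010/"}

# Assumption: "create" asks Crawlera to start a fresh session, and the id of
# that session comes back in the X-Crawlera-Session response header.
headers = {"X-Crawlera-Session": "create"}

response = requests.get(url, headers=headers, proxies=proxies, verify=False, timeout=20)
print(response.headers.get("X-Crawlera-Session"))

My current code, without any session handling, is below.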


# Sample code
import requests

url = "SOME_URL"
proxy_host = "proxy.crawlera.com"
proxy_port = "8010"
proxy_auth = "CRAWLERA_KEY:"  # API key as the username, followed by ':' (empty password)
proxies = {
    "http": "http://{}@{}:{}/".format(proxy_auth, proxy_host, proxy_port),
    "https": "http://{}@{}:{}/".format(proxy_auth, proxy_host, proxy_port),
}
response = requests.get(url, proxies=proxies, verify=False, timeout=20)
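
To make the problem concrete, here is a minimal sketch of the retry loop I would need, assuming the quota page can be recognised by some marker string (the marker below is just a placeholder for whatever the page actually says). What's missing is the step that forces Crawlera onto a new IP before the retry:

import requests

PROXIES = {
    "http": "http://CRAWLERA_KEY:@proxy.crawlera.com:8010/",
    "https": "http://CRAWLERA_KEY:@proxy.crawlera.com:8010/",
}
BAN_MARKER = "exceeded the quota"  # placeholder for the actual wording on the quota page

def fetch(url, max_attempts=3):
    # The site now answers with HTTP 200 even when the IP is over quota,
    # so the ban has to be detected from the page body, not the status code.
    response = None
    for _ in range(max_attempts):
        response = requests.get(url, proxies=PROXIES, verify=False, timeout=20)
        if BAN_MARKER not in response.text:
            return response
        # A plain retry goes out over whatever IP Crawlera picks; this is the
        # point where a "force new IP" header or session reset would be needed.
    return response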


1 Comment

I have the same question. I need a new IP for every crawl, since each crawl hits the same website, and it serves a different page if the IP has already been used in the last 24 hours. How can I accomplish that?
