A site I crawl with Python's requests library recently made a change: instead of timing out or returning a non-200 status code, it now returns a valid page saying I've exceeded the quota for my IP address.
Is there a header I can pass to ensure my Crawlera request goes through a new IP?
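For context, a minimal sketch of the kind of request involved, assuming the Crawlera proxy endpoint, port, and the `X-Crawlera-Session` header as described in Crawlera's documentation (the endpoint, `<API_KEY>` placeholder, and header name are assumptions here, not verified against a live account):

```python
# Sketch: routing a requests call through Crawlera and asking for a
# fresh outgoing IP. By default Crawlera rotates IPs per request; the
# X-Crawlera-Session header pins an IP, and sending "create" asks for
# a new session (and thus a new IP). Verify header names and the
# proxy endpoint against your own Crawlera account before relying
# on this.

CRAWLERA_PROXY = "http://<API_KEY>:@proxy.crawlera.com:8010"

def crawlera_kwargs(session_id=None):
    """Build keyword arguments for requests.get() via Crawlera.

    Pass session_id="create" to request a new session/IP; omit it
    to let Crawlera pick a (potentially new) IP per request.
    """
    headers = {}
    if session_id is not None:
        headers["X-Crawlera-Session"] = session_id
    return {
        "proxies": {"http": CRAWLERA_PROXY, "https": CRAWLERA_PROXY},
        "headers": headers,
    }

# Usage (not executed here; requires the requests library and a key):
# import requests
# resp = requests.get("http://example.com/page",
#                     **crawlera_kwargs(session_id="create"))
# new_session = resp.headers.get("X-Crawlera-Session")
```

Whether the rotation is guaranteed to be an IP unused in the last 24 hours is a separate question for Crawlera support; the sketch only shows where the header goes.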
Frank Mustermann posted over 3 years ago
I have the same question. I need a new IP for every crawl, since I'm crawling the same website and it shows a different page if the IP has already been used in the last 24 hours. How can I accomplish that?