For some time I've been running a simple scrape of the search-result count from the first page of results for a given query. This works, and still works, but it's INCREDIBLY slow. Granted, I'm not making concurrent requests, but is there any way to improve the per-request speed? Submitting different headers? Different Crawlera settings? I'm using the httr package in R to submit one request at a time, retrying whenever a request fails.
For context: individual requests can take up to 3 minutes to return, at which point I time out the request and try again.
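For reference, the request loop looks roughly like this (a minimal sketch, not my exact code; the Crawlera proxy host/port, the API-key placeholder, the example URL, the 180 s timeout, and the retry count are all assumptions for illustration):

```r
library(httr)

# Hypothetical helper: fetch one search-results page through the
# Crawlera proxy, timing out after 180 s and retrying on failure.
fetch_results_page <- function(query) {
  url <- paste0("https://www.example.com/search?q=", URLencode(query))
  resp <- RETRY(
    "GET", url,
    use_proxy("proxy.crawlera.com", 8010, "<CRAWLERA_API_KEY>", ""),
    timeout(180),     # abandon a single request after 3 minutes
    times = 3,        # retry timed-out or failed requests
    pause_base = 2    # exponential backoff between retries
  )
  stop_for_status(resp)
  content(resp, as = "text", encoding = "UTF-8")
}
```

httr's `RETRY()` already handles the timeout-then-retry cycle, so the question is really whether headers or Crawlera settings can shave time off each individual round trip.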