
Scrapy & Crawlera: not using all concurrent requests?

This has been an ongoing issue with Crawlera: I never seem to reach the concurrent requests on my plan (it says I have 100). I have set CONCURRENT_REQUESTS = 100, and even tried 1024, in my settings.py, and it does not change anything.


When I check Usage Stats in the Crawlera web app, it says I am only using 8 concurrency. So I'm paying for a plan that I'm not even using, even though I've looked at all the documentation and help questions and tried everything. Please address this issue.


Best Answer

The setting CONCURRENT_REQUESTS_PER_DOMAIN has a value of 8 by default:  https://docs.scrapy.org/en/latest/topics/settings.html#concurrent-requests-per-domain 
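A minimal settings.py sketch of the fix, assuming a plan of 100 concurrent requests (raising only CONCURRENT_REQUESTS is not enough when crawling a single domain, because the per-domain default of 8 still applies; the AutoThrottle and delay lines follow the usual Crawlera middleware recommendations):

```python
# settings.py -- sketch, assuming a Crawlera plan with 100 concurrent requests.

# Total cap across all domains.
CONCURRENT_REQUESTS = 100

# Per-domain cap; defaults to 8, which is usually the real bottleneck
# when a spider targets a single site.
CONCURRENT_REQUESTS_PER_DOMAIN = 100

# Crawlera manages throttling itself, so Scrapy's own throttling and
# download delay are typically disabled when using it.
AUTOTHROTTLE_ENABLED = False
DOWNLOAD_DELAY = 0
```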



@nestor thanks for the response. I have set that and see that it works... but I'm still not reaching my plan's 100 concurrent requests. I am reaching only 60, even though I've changed both settings to 100.

I see the latest concurrency is around 99; we have also made some configuration changes to handle the load for this domain.


For any further issues, please contact the Support team through Dashboard > Help > Contact Support. Support is available on weekdays for paying customers; they will assist you with your queries or technical issues.
