Scrapy & Crawlera: not using all concurrent requests?

Posted almost 5 years ago by empyreone

Answered

This has been an ongoing issue with Crawlera: I never seem to reach the concurrent requests on my plan (it says I have 100). I have set CONCURRENT_REQUESTS = 100, and even tried 1024, in my settings.py, and it does not change anything.


When I check Usage Stats in the Crawlera web app, it says I am only using a concurrency of 8. So I'm paying for a plan that I'm not even using, even though I've looked through all the documentation and help questions and tried everything. Please address this issue.



nestor posted almost 5 years ago (Admin, Best Answer)

The setting CONCURRENT_REQUESTS_PER_DOMAIN has a value of 8 by default:  https://docs.scrapy.org/en/latest/topics/settings.html#concurrent-requests-per-domain 
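
In other words, the global CONCURRENT_REQUESTS value never comes into play for a single-domain crawl, because the per-domain default of 8 caps it first. A minimal settings.py sketch (assuming a plan limit of 100, as in this thread; all three are standard Scrapy settings):

    # settings.py -- sketch for reaching ~100 concurrent requests on one domain.
    # CONCURRENT_REQUESTS is only the global ceiling; CONCURRENT_REQUESTS_PER_DOMAIN
    # (default 8) is what was silently capping this spider.
    CONCURRENT_REQUESTS = 100
    CONCURRENT_REQUESTS_PER_DOMAIN = 100

    # If set to a non-zero value, CONCURRENT_REQUESTS_PER_IP overrides the
    # per-domain limit; 0 (the default) disables the per-IP cap.
    CONCURRENT_REQUESTS_PER_IP = 0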



2 Comments


thriveni posted almost 5 years ago (Admin)

I see the latest concurrency is around 99; we have also made some configuration changes to handle the load for this domain.


For any further issues, please contact the Support team through Dashboard > Help > Contact Support. Support is available on weekdays for paying customers, and they will assist you with any queries or technical issues.



empyreone posted almost 5 years ago

@nestor thanks for the response. I have raised that setting and can see that it works, but I'm still not reaching my plan's 100 concurrent requests. I am only reaching about 60, even though I've changed the settings to 100.
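
If a crawl still plateaus below the plan limit after raising the per-domain cap, a few other client-side settings commonly throttle a Crawlera-backed spider. A sketch of the usual suspects to rule out before assuming a server-side cap (standard Scrapy and scrapy-crawlera settings; the values are illustrative, not taken from this thread):

    # settings.py -- settings that can hold effective concurrency below the target.

    # AutoThrottle dynamically lowers concurrency based on response latency;
    # it is normally disabled when a smart proxy like Crawlera manages rates.
    AUTOTHROTTLE_ENABLED = False

    # Any non-zero delay serializes requests per slot, reducing effective concurrency.
    DOWNLOAD_DELAY = 0

    # scrapy-crawlera middleware setup, per its documentation
    # (the API key placeholder is illustrative).
    DOWNLOADER_MIDDLEWARES = {
        'scrapy_crawlera.CrawleraMiddleware': 610,
    }
    CRAWLERA_ENABLED = True
    CRAWLERA_APIKEY = '<your-api-key>'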


