Scrapinghub: concurrent requests settings appear not to be respected

Posted over 5 years ago by Rodrigo Palacios

Answered

Hi Scrapinghub team, we have a high-priority crawl job that needs to finish by tomorrow. We raised the CONCURRENT_REQUESTS and CONCURRENT_REQUESTS_PER_DOMAIN settings to 100 each, but we're still only seeing a rough average of 27 requests per minute.

Looking at the request log, it also appears that our requests aren't being sent out in parallel. Any help is appreciated.
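
For context, here is a rough sketch of the Scrapy settings that control concurrency in a project like ours; apart from the two values we raised to 100, the lines and values below are illustrative assumptions rather than a copy of our actual settings.py:

# settings.py -- a minimal sketch of the concurrency-related settings,
# with assumed/illustrative values (only the two 100s reflect what we set).

CONCURRENT_REQUESTS = 100             # global cap across all domains
CONCURRENT_REQUESTS_PER_DOMAIN = 100  # cap per domain
CONCURRENT_REQUESTS_PER_IP = 0        # if non-zero, Scrapy uses this instead of the per-domain cap

# Either of these will throttle the crawl no matter how high the caps above are:
DOWNLOAD_DELAY = 0                    # a positive delay rate-limits each domain/IP slot
AUTOTHROTTLE_ENABLED = False          # when True, AutoThrottle adjusts delays dynamically and
                                      # AUTOTHROTTLE_TARGET_CONCURRENCY (default 1.0) drives throughput

For what it's worth, 27 requests per minute works out to roughly one request every 2.2 seconds, which looks more like a delay- or AutoThrottle-limited crawl than 100 requests in parallel.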


Adriana Anghel posted over 5 years ago Admin Best Answer

Hi! I have converted this into a support ticket - you should have received an email notification. One of our Support Engineers will be in touch soon. Bear in mind that while you have an active subscription with us, you are able to open support tickets directly from your dashboard, under Help - Contact Support. We respond to support tickets within one business day. The forum is for customers who don't have an active subscription with us.


