Queue

Posted over 7 years ago by indospirit1

Answered

Can I somehow queue scraping requests for the same spider? Currently, if I try to call a spider multiple times using the Scrapinghub API, it rejects the latter request.



vaz posted over 7 years ago Best Answer

Hi Indospirit1,


Have you tried Scheduling jobs?


Best,


Pablo

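The "Scheduling jobs" suggestion refers to Scrapy Cloud's job-scheduling API. A minimal sketch of composing such a request with only the standard library, assuming the legacy `schedule.json` endpoint and a hypothetical project ID (`12345`) and spider name (`myspider`):

```python
# Sketch only: compose the form body for a Scrapy Cloud schedule call.
# The endpoint URL, project ID, and spider name are assumptions; check
# the current API documentation before relying on them.
import urllib.parse

API_URL = "https://app.scrapinghub.com/api/schedule.json"  # assumed endpoint

def build_schedule_request(project_id, spider, **spider_args):
    """Return the URL and form-encoded body for one schedule call."""
    params = {"project": project_id, "spider": spider}
    params.update(spider_args)  # optional extra spider arguments
    return API_URL, urllib.parse.urlencode(params)

url, body = build_schedule_request(12345, "myspider")
```

The request itself would be sent as an authenticated POST (for example with `curl -u APIKEY:`); the sketch stops short of the network call.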


4 Comments


indospirit1 posted over 7 years ago

Hi vaz,


Can you briefly explain "perhaps you can clone spiders and run simultaneously"? Does that mean copying the spider into multiple files, or creating different projects with the same spider?



vaz posted over 7 years ago

Hi Indospirit,


If you have N containers you can run N different spiders; perhaps you can clone your spider and run the copies simultaneously.


We have provided extensive documentation of our API here: https://doc.scrapinghub.com/scrapy-cloud.html#


If you still find it difficult to follow, please consider hiring our experts through https://scrapinghub.com/quote; it can save you a lot of time and resources.


Best regards,


Pablo



indospirit1 posted over 7 years ago

Hi,


It does not solve my problem. I want to invoke a spider via the Scrapinghub API, but it currently rejects a job for the same spider if one is already running. Is there a way to queue the same spider job using the API?


Does it help if I buy more containers?


If I have N containers, can I schedule or run the same spider N times, simultaneously or in a queue?
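Since the platform rejects a second job for a spider that is already running, one workaround is a small client-side queue that only submits the next run once nothing is running. A sketch, with `schedule` and `is_running` left as placeholders to be wired to the actual API:

```python
# Sketch: submit queued requests for the same spider one at a time.
# schedule(args) and is_running() are placeholders for real API calls.
from collections import deque
import time

def drain_queue(requests, schedule, is_running, poll_delay=0.0):
    """Submit each queued request once no job for the spider is running."""
    pending = deque(requests)
    launched = []
    while pending:
        if is_running():
            time.sleep(poll_delay)  # back off, then poll again
            continue
        launched.append(schedule(pending.popleft()))
    return launched
```

`is_running()` could be implemented by listing the project's running jobs through the API and checking whether any belong to the spider in question.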


