That does not solve my problem. I want to invoke a spider via the Scrapinghub API, but currently it rejects a job for a spider that is already running. Is there a way to queue the same spider job through the API?
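One client-side workaround, if the API refuses duplicate jobs, is to retry the schedule request until it is accepted. This is a sketch, not a Scrapy Cloud feature: the response shape (a `status` field and a `jobid`) and the rejection behaviour are assumptions you should check against the API docs. The HTTP call itself is injected as a callable so you can plug in whatever endpoint the docs specify.

```python
import time

def schedule_spider(post, project, spider, retries=3, wait=60):
    """Crude client-side queue: retry until the API accepts the job.

    `post` is any callable that submits the schedule request and returns
    the decoded JSON response -- e.g. a thin wrapper around requests.post()
    against Scrapy Cloud's schedule endpoint (endpoint name and response
    shape are assumptions here, not taken from the docs).
    """
    for attempt in range(retries):
        payload = post({"project": project, "spider": spider})
        if payload.get("status") == "ok":
            return payload.get("jobid")
        # Rejected (e.g. the same spider is already running): wait and retry.
        time.sleep(wait)
    raise RuntimeError("gave up scheduling %r after %d attempts" % (spider, retries))
```

In practice you would pass in a small wrapper that does an authenticated HTTP POST with your API key and returns `response.json()`; the retry loop then acts as the "queue" the API does not provide.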
Would it help if I bought more containers?
If I have N containers, can I schedule or run the same spider N times, simultaneously or in a queue?
If you have N containers you can run N different spiders; perhaps you can clone spiders and run them simultaneously.
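One reading of "clone spiders" is to register the same spider logic under a second name inside the same project, so the platform sees two distinct spiders and lets both run at once. A minimal sketch of the idea (in a real project these classes would subclass `scrapy.Spider`; a plain base class is used here so the snippet is self-contained, and the spider names are made up):

```python
# Stand-in for scrapy.Spider, so the sketch runs without Scrapy installed.
class Spider:
    name = None

class ProductSpider(Spider):
    name = "products"
    start_urls = ["https://example.com/products"]

# The "clone": identical behaviour inherited from ProductSpider, but a
# different name attribute, so it is scheduled as a separate spider.
class ProductSpiderClone(ProductSpider):
    name = "products_clone"
```

Whether this is the duplication the support reply had in mind (versus copying the spider into another project) is exactly the follow-up question asked below, so treat it as one possible interpretation.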
We have provided extensive documentation of our API here: https://doc.scrapinghub.com/scrapy-cloud.html#
If you still find it difficult to follow, please consider hiring our experts through https://scrapinghub.com/quote; it can save you a lot of time and resources.
Can you briefly explain "perhaps you can clone spiders and run them simultaneously"? Does that mean copying the spider into multiple files, or creating different projects with the same spider?