Is it possible to schedule jobs to run sequentially?



Yes. Sequential runs can be implemented by combining Scrapy signals with a job-scheduling API call (see the Jobs API and Scrapinghub's Python client). The idea is to catch the spider_closed signal and, from the handler, trigger the next job through the API or the client.
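Below is a minimal sketch of one way to wire this up: a Scrapy extension that connects to spider_closed and schedules the next spider with the scrapinghub Python client. The setting names SH_APIKEY, SH_PROJECT_ID, and NEXT_SPIDER are placeholders invented for this example, not settings the platform defines.

    import logging

    from scrapy import signals
    from scrapinghub import ScrapinghubClient

    logger = logging.getLogger(__name__)


    class ScheduleNextJob:
        """Scrapy extension: when this spider finishes, start the next one on Scrapy Cloud."""

        def __init__(self, api_key, project_id, next_spider):
            self.api_key = api_key
            self.project_id = project_id
            self.next_spider = next_spider

        @classmethod
        def from_crawler(cls, crawler):
            # Placeholder setting names; define them in settings.py or via the UI.
            ext = cls(
                api_key=crawler.settings.get("SH_APIKEY"),
                project_id=crawler.settings.get("SH_PROJECT_ID"),
                next_spider=crawler.settings.get("NEXT_SPIDER"),
            )
            # Catch the spider_closed signal so we can chain the next job.
            crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
            return ext

        def spider_closed(self, spider, reason):
            # Only chain the next job if this run finished normally.
            if reason != "finished":
                logger.info("Spider closed with reason %r; not scheduling %s", reason, self.next_spider)
                return
            client = ScrapinghubClient(self.api_key)
            project = client.get_project(self.project_id)
            job = project.jobs.run(self.next_spider)
            logger.info("Scheduled next job: %s", job.key)

The extension would then be enabled through the EXTENSIONS setting, e.g. EXTENSIONS = {"myproject.extensions.ScheduleNextJob": 500} (path hypothetical). The same pattern works with a plain HTTP call to the Jobs API in place of the client.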

