
Updating a file after each crawl

I was wondering if it's possible to load an existing data file, crawl new data to update it, and then overwrite the old file on Scrapy Cloud.

For example, if I am tracking the price of one product, I'd like to store all of the old values along with their dates. Is there a way to use Scrapy Cloud to crawl the data and then update a file (which I can download later)?
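For context, here is a minimal sketch of what I have in mind when running locally (the file name, product ID, and CSV columns are just examples I made up): each crawl appends the current price with today's date to a CSV history file, so the full price history accumulates over time.

```python
import csv
import os
from datetime import date

def append_price(history_path, product_id, price):
    """Append today's price for a product to a CSV history file.

    Creates the file with a header row if it does not exist yet,
    so the same function works on the first crawl and every later one.
    """
    new_file = not os.path.exists(history_path)
    with open(history_path, "a", newline="") as f:
        writer = csv.writer(f)
        if new_file:
            writer.writerow(["date", "product_id", "price"])
        writer.writerow([date.today().isoformat(), product_id, price])

# Example: two crawls for the same product add two history rows
append_price("prices.csv", "sku-123", 19.99)
append_price("prices.csv", "sku-123", 18.49)
```

My question is essentially whether this append-to-a-file pattern can work on Scrapy Cloud, given that (as I understand it) the job's local filesystem is not persisted between runs, so the file would need to be downloaded and re-uploaded, or stored somewhere external.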
