Updating a file after each crawl

Posted over 6 years ago by sophiayu813

I was wondering whether it's possible, on Scrapy Cloud, to load an existing database, crawl new data, update the database, and then overwrite the old one.

For example, if I am tracking the price of one product, I'd like to store all the old values and their dates. Is there a way to use Scrapy Cloud to crawl the data and then update a file (to download later)?
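
A minimal sketch of one possible approach, assuming Scrapy Cloud's Collections storage (persistent key/value records that survive between jobs) and the python-scrapinghub client; the API key, project ID, collection name, and product SKU below are placeholders:

```python
from datetime import date

from scrapinghub import ScrapinghubClient

# Placeholders: substitute a real API key and project ID.
client = ScrapinghubClient("YOUR_API_KEY")
project = client.get_project(123456)

# A named collection acts as the persistent store that outlives individual jobs.
store = project.collections.get_store("product_prices")


def record_price(sku, price):
    """Append today's price for a product without touching older entries."""
    today = date.today().isoformat()
    store.set({
        "_key": f"{sku}#{today}",  # one record per product per day
        "value": {"sku": sku, "date": today, "price": price},
    })


def price_history(sku):
    """Read back every price point stored so far for one product."""
    return [item["value"] for item in store.iter(prefix=f"{sku}#")]


if __name__ == "__main__":
    record_price("sku-123", 19.99)    # hypothetical product and price
    print(price_history("sku-123"))   # all dates and values accumulated so far
```

The same calls could be made from a spider or item pipeline running on Scrapy Cloud, so each scheduled crawl would append its rows and the accumulated collection could be exported through the API whenever a download is needed.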
