Scraping from URL list: list not found

Posted over 5 years ago by shaping2startups

Answered

I am planning to run a scraper on Scrapinghub that requests URLs from an external list. Both the scraper and the URL list are saved in a local project folder on my drive.

However, when running the script on Scrapinghub, I receive the following error:


File "/app/__main__.egg/TermSpider/spiders/quotes.py", line 8, in start_requests     with open('urls.csv') as file: IOError: [Errno 2] No such file or directory: 'urls.csv'


Does anyone have an idea why the URL list is not found? Do I have to upload it manually to Scrapinghub? I have used the manual deploy function so far and assumed that the entire project folder had been transferred.
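
For context, a minimal reconstruction of the failing pattern (only the open('urls.csv') call in start_requests comes from the traceback; the spider class, CSV layout, and request logic here are assumptions):

    # TermSpider/spiders/quotes.py: a reconstruction, not the original code
    import csv

    import scrapy


    class QuotesSpider(scrapy.Spider):
        name = 'quotes'

        def start_requests(self):
            # A relative path like this works when the spider runs from the
            # project folder locally, but on Scrapy Cloud the project is
            # deployed as an egg, so there is no urls.csv in the working
            # directory, hence the IOError above.
            with open('urls.csv') as file:
                for row in csv.reader(file):
                    yield scrapy.Request(url=row[0], callback=self.parse)

        def parse(self, response):
            yield {'url': response.url}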



thriveni posted over 5 years ago Admin Best Answer


Please ensure that the CSV file has also been deployed to Scrapy Cloud, as described in https://support.scrapinghub.com/solution/articles/22000200416-deploying-non-code-files. Only then will you be able to access the file in Scrapy Cloud.
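
In practice, that article boils down to two steps: declare the file as package data in setup.py so it is bundled into the egg that gets deployed, and read it with pkgutil instead of a relative open(). A minimal sketch, assuming the package is named TermSpider and the CSV is moved to TermSpider/resources/urls.csv (both assumptions):

    # setup.py: name, version and file path are placeholders
    from setuptools import setup, find_packages

    setup(
        name='TermSpider',
        version='1.0',
        packages=find_packages(),
        # Bundle the URL list into the deployed egg so it exists on Scrapy Cloud.
        package_data={'TermSpider': ['resources/urls.csv']},
        entry_points={'scrapy': ['settings = TermSpider.settings']},
    )

The spider then loads the bundled copy instead of opening a path on disk:

    # TermSpider/spiders/quotes.py: read the packaged file via pkgutil
    import csv
    import io
    import pkgutil

    import scrapy


    class QuotesSpider(scrapy.Spider):
        name = 'quotes'

        def start_requests(self):
            # pkgutil.get_data returns the file's bytes from inside the egg.
            data = pkgutil.get_data('TermSpider', 'resources/urls.csv')
            for row in csv.reader(io.StringIO(data.decode('utf-8'))):
                yield scrapy.Request(url=row[0], callback=self.parse)

        def parse(self, response):
            yield {'url': response.url}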





2 Comments


shaping2startups posted over 5 years ago

Thanks, I have now included the URL list directly in the script's code.
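
For reference, embedding the list in the spider itself sidesteps the packaging step entirely; a rough sketch, with placeholder URLs and an assumed spider name:

    import scrapy


    class QuotesSpider(scrapy.Spider):
        name = 'quotes'
        # URL list embedded directly in the code (placeholders, not the
        # original list), so no external file needs to be deployed.
        start_urls = [
            'http://example.com/page/1',
            'http://example.com/page/2',
        ]

        def parse(self, response):
            yield {'url': response.url}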


