Answered

Scraping from URL list: list not found

I am planning to run a scraper on Scrapinghub that reads its URLs from an external list. Both the scraper and the URL list are saved in a local project folder on my drive.


When running the script on Scrapinghub, however, I receive the following error:


File "/app/__main__.egg/TermSpider/spiders/quotes.py", line 8, in start_requests
    with open('urls.csv') as file:
IOError: [Errno 2] No such file or directory: 'urls.csv'


Does anyone have an idea why the URL list is not found? Do I have to upload it manually to Scrapinghub? I have used the manual deploy function so far and assumed that the entire project folder had been transferred.
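For context, the file-reading step that raises this error presumably looks something like the helper below (a minimal sketch; the CSV layout of one URL per row is an assumption):

```python
import csv

def load_start_urls(path="urls.csv"):
    """Read one URL per row from a CSV file.

    Note: open() resolves the path relative to the *current working
    directory*. On Scrapy Cloud that directory does not contain the
    local project folder's loose files, which is why the IOError above
    occurs when urls.csv was never deployed.
    """
    with open(path, newline="") as file:
        return [row[0] for row in csv.reader(file) if row]
```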


Best Answer


Please ensure that the CSV file has also been deployed to Scrapy Cloud, as described in https://support.scrapinghub.com/solution/articles/22000200416-deploying-non-code-files; only then will you be able to access the file in Scrapy Cloud.
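Per the linked article, non-code files are typically deployed by listing them in the project's setup.py. A sketch under the assumption that the package is named TermSpider (inferred from the traceback; the exact file layout may differ in your project):

```python
# setup.py -- sketch only; "TermSpider" and resources/urls.csv are
# assumptions about this project's layout, not a verified config.
from setuptools import setup, find_packages

setup(
    name='project',
    version='1.0',
    packages=find_packages(),
    package_data={'TermSpider': ['resources/urls.csv']},
    include_package_data=True,
    entry_points={'scrapy': ['settings = TermSpider.settings']},
)
```

Inside the deployed egg, the file would then be read with pkgutil.get_data('TermSpider', 'resources/urls.csv') rather than a plain open() call, since relative file paths do not resolve inside the egg.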







Thanks, I have included the URL list directly in the code of the script now.
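That workaround — embedding the URLs in the spider module instead of reading them from urls.csv — might look roughly like this (the URLs are placeholders, and make_request stands in for scrapy.Request so the sketch stays framework-independent):

```python
# Hypothetical replacement for the file-based list: the URLs are
# embedded directly in the spider module instead of read from urls.csv.
START_URLS = [
    "http://quotes.toscrape.com/page/1/",
    "http://quotes.toscrape.com/page/2/",
]

def start_requests(make_request):
    """Yield one request per embedded URL, mirroring a spider's
    start_requests() without the file dependency."""
    for url in START_URLS:
        yield make_request(url)
```

This avoids the deployment issue entirely, at the cost of redeploying the project whenever the URL list changes.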
