With Scrapy, I have a job that has generated a few errors due to data in the response not being in the format the scraper was expecting. What is the best way to store those responses so I can go back and look at them afterwards? Is there a setting I can use for this?
macqres
https://app.scrapinghub.com/p/212215/addons/page_storage seems to be a bit different. In my case there wasn't an error with the response itself; it's simply that my code didn't handle it correctly.
Hi,
The best way is to check the logs of your particular job and filter by errors; see this example:
https://app.scrapinghub.com/p/212215/1/32/log?filterAndHigher&filterType=error
Best,
Pablo
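If you want to keep the offending responses themselves (not just the log lines), one option is Scrapy's built-in HTTP cache (`HTTPCACHE_ENABLED = True` in `settings.py`), which stores every response on disk. Another is to catch parsing failures in your callback and dump the response body yourself. Here is a minimal sketch; `save_failed_response` and `extract_item` are hypothetical names, not Scrapy APIs:

```python
import hashlib
import os


def save_failed_response(url, body, directory="failed_responses"):
    """Write a response body to disk, named by a hash of its URL,
    so it can be inspected after the crawl finishes."""
    os.makedirs(directory, exist_ok=True)
    name = hashlib.sha1(url.encode("utf-8")).hexdigest() + ".html"
    path = os.path.join(directory, name)
    with open(path, "wb") as f:
        f.write(body)
    return path


# Inside a Scrapy callback you might use it like this (sketch):
#
# def parse(self, response):
#     try:
#         yield self.extract_item(response)   # your parsing logic
#     except (ValueError, KeyError) as exc:
#         self.logger.error("Bad response %s: %s", response.url, exc)
#         save_failed_response(response.url, response.body)
```

This way the crawl keeps running, the error still shows up in the job log, and the raw HTML that broke the parser is waiting on disk afterwards.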