I have a spider that suddenly throws this exception on about half of my requests:
```
[scrapy.utils.signal] Error caught on signal handler: <bound method FeedExporter.item_scraped of <scrapy.extensions.feedexport.FeedExporter object at 0x7fde660b5978>>
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/site-packages/twisted/internet/defer.py", line 151, in maybeDeferred
    result = f(*args, **kw)
  File "/usr/local/lib/python3.7/site-packages/pydispatch/robustapply.py", line 55, in robustApply
    return receiver(*arguments, **named)
  File "/usr/local/lib/python3.7/site-packages/scrapy/extensions/feedexport.py", line 264, in item_scraped
    slot = self.slot
AttributeError: 'FeedExporter' object has no attribute 'slot'
```
I'm exporting my results to AWS S3 like so in settings.py:
```python
S3_BUCKET = "s3://my-bucket"
S3_REGION = "eu-west-1"
FEED_FORMAT = "csv"
FEED_EXPORT_ENCODING = "utf-8"
FEED_URI = S3_BUCKET + "/%(name)s/%(time)s.csv"
```
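For context on the traceback: this error pattern is consistent with the feed storage failing to initialize when the spider opens, so the exporter's `slot` attribute is never set and every later `item_scraped` call raises. The sketch below illustrates that mechanism with a toy class (the class and its methods are illustrative only, not Scrapy's actual code):

```python
# Illustrative sketch: if setup bails out before an attribute is assigned,
# every later callback hits AttributeError, mirroring FeedExporter.item_scraped.
class ToyExporter:
    def open_spider(self, storage_ok):
        if not storage_ok:
            # e.g. the S3 storage backend could not be opened -> return
            # early, leaving self.slot unset
            return
        self.slot = object()

    def item_scraped(self, item):
        slot = self.slot  # raises AttributeError when open_spider bailed out
        return item


exporter = ToyExporter()
exporter.open_spider(storage_ok=False)
try:
    exporter.item_scraped({"field": "value"})
except AttributeError as exc:
    print(exc)  # 'ToyExporter' object has no attribute 'slot'
```

If this is what is happening, the real clue would be an earlier error in the log from the `spider_opened` handler (for example, a missing `botocore` dependency or rejected S3 credentials), logged once before the repeated `item_scraped` failures.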
The spider runs as expected locally and did so previously on ScrapingHub as well.
Any clues on how to debug?
A clarification (as I wasn't able to edit the original post):
I wrote "...throws this exception on about half of my requests", but should rather have written: "...throws this exception upon storage of about half of my items".