I'm winding down a project, and at 2017-10-04 05:48:00 UTC I turned off the Scrapy Cloud container units for my account. I then went to download a bunch of the jobs from the last ~1-2 months using the Python API and noticed I was getting an HTTP unauthorized response.
Looking at the activity log, I see that all of my jobs were moved into the "deleted" area at 2017-10-04 06:47:55 UTC, only an hour afterwards. This is incredibly frustrating, since the data retention policy was supposed to be at least a week even on the free tier. Plus, I even went back and added another Scrapy Cloud container unit, but there's no way to apply the expanded 120-day data retention I'm now paying for to the jobs stuck in the deleted area!
Is there any way to give me access so I can at least download the data off your servers using the jobs API before it actually gets deleted? (I know the data isn't completely gone yet, since I'm still able to download individual items from these jobs.)
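For context, this is roughly how I'm trying to pull the data down. It's a minimal sketch using the python-scrapinghub client; the API key and project ID are placeholders, and whether jobs in the deleted area are still listable under a "deleted" state is my assumption:

# Minimal sketch using the python-scrapinghub client (pip install scrapinghub).
# APIKEY and PROJECT_ID are placeholders; listing jobs under state="deleted"
# is an assumption based on how they appear in the activity log.
from scrapinghub import ScrapinghubClient

APIKEY = "your-api-key"   # placeholder
PROJECT_ID = 123456       # placeholder

client = ScrapinghubClient(APIKEY)
project = client.get_project(PROJECT_ID)

# Walk the jobs (including ones moved to the deleted area, if the API
# still lists them) and dump every item from each job.
for summary in project.jobs.iter(state="deleted"):
    job = client.get_job(summary["key"])
    for item in job.items.iter():
        print(item)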
Clayton Drazner posted about 7 years ago Best Answer
Update: things are working now; I must have been mistyping my API key or something. So even though it was deleted really quickly, I'm still able to download (at least a large chunk of) it. Sorry for the false alarm!
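In case it helps anyone else: besides the client library, a single job's items can also be fetched straight from the Items API over HTTP. A minimal sketch, assuming the storage.scrapinghub.com items endpoint and a placeholder job key; the API key goes in as the basic-auth username:

# Fetch one job's items directly from the Items API.
# APIKEY and JOB_KEY ("project/spider/job") are placeholders.
import requests

APIKEY = "your-api-key"    # placeholder
JOB_KEY = "123456/1/2"     # placeholder: project/spider/job

resp = requests.get(
    "https://storage.scrapinghub.com/items/{}".format(JOB_KEY),
    params={"format": "json"},
    auth=(APIKEY, ""),  # API key as basic-auth username, empty password
)
resp.raise_for_status()
for item in resp.json():
    print(item)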
Clayton Drazner posted about 7 years ago
*items api