URGENT - data moved into deleted area 1 hour after turning off cloud unit
Clayton Drazner
started a topic
about 6 years ago
I'm winding down a project and at 2017-10-04 05:48:00 UTC turned off my scrapy cloud container units for my account. I went to download a bunch of the jobs from the last ~1-2 months using the python api, and noticed I was getting an HTTP unauthorized response.
Looking at the activity log, I see that all of my jobs were moved into the "deleted" area at 2017-10-04 06:47:55 UTC, which was only an hour afterwards. This is incredibly frustrating since the data retention policy was supposed to be at least a week even in the free tier. Plus, I even went back and added another Scrapy Cloud container unit, but there's no way to apply the expanded 120-day data retention I'm now paying for to the jobs stuck in the deleted area!
Is there any way to give me access to at least download the data off your servers using the jobs API before it actually gets deleted? (I know the data isn't completely gone yet since I'm still able to download individual items from these jobs.)
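In case it helps anyone else hitting the same 401, this is roughly the request shape my download script uses — a minimal sketch assuming the storage items API endpoint, with a made-up job ID (`123/1/4`) and a placeholder API key; the key goes in as the HTTP basic-auth username with an empty password:

```python
import requests

API_ROOT = "https://storage.scrapinghub.com/items"


def items_url(job_id: str, fmt: str = "json") -> str:
    """Build the items API URL for a job ID of the form project/spider/job."""
    return f"{API_ROOT}/{job_id}?format={fmt}"


def fetch_items(job_id: str, apikey: str):
    """Download all items for a job; a mistyped key surfaces here as 401."""
    resp = requests.get(items_url(job_id), auth=(apikey, ""))
    resp.raise_for_status()  # raises for 401 Unauthorized, 404, etc.
    return resp.json()


if __name__ == "__main__":
    # Hypothetical IDs for illustration only.
    print(items_url("123/1/4"))
```

A wrong or mistyped key fails with 401 Unauthorized before any retention question even comes into play, which is worth ruling out first.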
Best Answer
Clayton Drazner
said
about 6 years ago
Update: things are working now; I must have been typing my API key in wrong or something. So even though it was deleted really quickly, I'm still able to download (at least a large chunk of) it. Sorry for the false alarm!
Clayton Drazner
*items api