Hi,
I want to get a unique ID for each Scrapy job I run and use that ID as part of the items I yield, which get dumped to the database after scraping.
Is this possible, and what would the ID be?
Yingke Yu
nestor
There's an environment variable, SHUB_JOBKEY, in the format "project_id/spider_id/job_id".
There's also job_key in the Python Scrapinghub client: https://python-scrapinghub.readthedocs.io/en/latest/client/apidocs.html
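
As a minimal sketch of the environment-variable approach (the spider name, URL, and item fields below are made up for illustration), you could read SHUB_JOBKEY in the spider and attach it to every yielded item:

import os

import scrapy


class JobIdSpider(scrapy.Spider):
    # Hypothetical spider, just to show the pattern.
    name = "job_id_example"
    start_urls = ["https://example.com"]

    def parse(self, response):
        # Scrapy Cloud sets SHUB_JOBKEY, e.g. "123456/2/7"
        # (project_id/spider_id/job_id); it is unset when running locally,
        # so os.environ.get() returns None as a fallback.
        job_key = os.environ.get("SHUB_JOBKEY")
        yield {
            "url": response.url,
            "job_id": job_key,  # carries the unique job key into the database
        }

Since the job key is unique per run, every item written to the database in a given job will share the same job_id, which you can then filter or join on later.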