Posted almost 6 years ago by Yingke Yu
Hi,
I'd like to get one unique ID for each Scrapy job I run, and use that ID as part of the items I yield, which get dumped to the database after scraping.
Is this possible, and what would the ID be?
0 Votes
nestor posted almost 6 years ago Admin Best Answer
There's an environment variable, SHUB_JOBKEY, in the format "project_id/spider_id/job_id".
There's also job_key in the Python Scrapinghub client: https://python-scrapinghub.readthedocs.io/en/latest/client/apidocs.html
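A minimal sketch of reading SHUB_JOBKEY and attaching it to yielded items. The `get_job_key` and `tag_item` helpers are illustrative names, not part of any library; the fallback value for local runs is an assumption:

```python
import os

def get_job_key(default="local/0/0"):
    # SHUB_JOBKEY is set by Scrapy Cloud in the format
    # "project_id/spider_id/job_id". It is unset when running locally,
    # so fall back to a placeholder (assumed convention for local runs).
    return os.environ.get("SHUB_JOBKEY", default)

def tag_item(item):
    # Attach the job key to an item dict before it is yielded;
    # in a real spider you might do this in a pipeline instead.
    item["job_key"] = get_job_key()
    return item
```

For example, on Scrapy Cloud `get_job_key()` would return something like "12345/6/7", so every item yielded by one job carries the same key.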