On my local machine everything works fine, but when I deployed to ScrapingHub I got an error saying "ImportError: No module named mysql.connector".
All I need is for the spider, whether I run it manually or through the job scheduler, to automatically add all the scraped items to my database.
I'm also considering using the Items API if there is no other way to solve this issue.
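For reference, here is a minimal sketch of the kind of item pipeline I mean (the connection details, table, and field names below are placeholders, not my real ones):

# pipelines.py -- sketch only; credentials and schema are illustrative
import mysql.connector

class MySQLPipeline:
    def open_spider(self, spider):
        # Open one connection per spider run (placeholder credentials)
        self.conn = mysql.connector.connect(
            host="db-host",        # placeholder
            user="db-user",        # placeholder
            password="db-pass",    # placeholder
            database="scraping",   # placeholder
        )
        self.cursor = self.conn.cursor()

    def process_item(self, item, spider):
        # Assumes a table named `items` with `title` and `url` columns (illustrative)
        self.cursor.execute(
            "INSERT INTO items (title, url) VALUES (%s, %s)",
            (item.get("title"), item.get("url")),
        )
        self.conn.commit()
        return item

    def close_spider(self, spider):
        self.cursor.close()
        self.conn.close()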
Please help, thank you!
nestor (Admin) posted over 5 years ago · Best Answer
The MySQL connector is not installed by default in Scrapy Cloud; you need to install it by adding it to your Python dependencies in requirements.txt: https://support.scrapinghub.com/support/solutions/articles/22000200400-deploying-python-dependencies-for-your-projects-in-scrapy-cloud
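For example, assuming the package you need is mysql-connector-python from PyPI (it provides the mysql.connector module), a requirements.txt at your project root would contain:

mysql-connector-python

and your scrapinghub.yml would point to it (the project ID here is a placeholder):

projects:
  default: 12345
requirements:
  file: requirements.txt

Then redeploy with shub deploy and the dependency will be installed alongside your spider in Scrapy Cloud.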