supportgpa
Everything works fine on my local machine, but when I deployed to ScrapingHub I got the error "ImportError: No module named mysql.connector".
What I need is for every run of my spider, whether started manually or through the job scheduler, to automatically add all the scraped items to my database.
I am also trying the Items API in case there is no other way to solve this issue.
Please help, thank you!
nestor
mysql-connector is not installed by default in Scrapy Cloud; you need to install it by adding it to your Python dependencies in requirements.txt: https://support.scrapinghub.com/support/solutions/articles/22000200400-deploying-python-dependencies-for-your-projects-in-scrapy-cloud
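For example, add the connector package to requirements.txt (the version pin below is only an illustration; pin whatever you use locally):

    mysql-connector-python==8.0.33

and point your scrapinghub.yml at that file when deploying with shub (12345 is a placeholder project ID):

    projects:
      default: 12345
    requirements:
      file: requirements.txt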
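Once the dependency deploys, the usual way to get every scraped item into MySQL is a Scrapy item pipeline. Below is a minimal sketch, not your actual code: the host, credentials, the scraped_items table, and the title/url fields are all placeholder assumptions, so swap in your own schema and settings:

    # pipelines.py -- minimal sketch of a MySQL item pipeline.
    # Connection settings, table name, and columns are assumptions;
    # replace them with your own credentials and schema.
    import mysql.connector

    class MySQLPipeline:
        def open_spider(self, spider):
            # Open one connection per crawl.
            self.conn = mysql.connector.connect(
                host="localhost",      # assumed host
                user="scrapy_user",    # assumed user
                password="secret",     # assumed password
                database="scrapy_db",  # assumed database
            )
            self.cursor = self.conn.cursor()

        def process_item(self, item, spider):
            # Insert each scraped item; the parameterized query
            # avoids SQL injection.
            self.cursor.execute(
                "INSERT INTO scraped_items (title, url) VALUES (%s, %s)",
                (item.get("title"), item.get("url")),
            )
            self.conn.commit()
            return item

        def close_spider(self, spider):
            self.cursor.close()
            self.conn.close()

Enable it in settings.py with ITEM_PIPELINES = {"myproject.pipelines.MySQLPipeline": 300} (the module path is assumed); the pipeline then runs for manual runs and scheduled jobs alike.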
Related topics:
- Unable to select Scrapy project in GitHub
- ScrapyCloud can't call spider?
- Unhandled error in Deferred
- Item API - Filtering
- newbie to web scraping but need data from zillow
- ValueError: Invalid control character
- Cancelling account
- Best Practices
- Beautifulsoup with ScrapingHub
- Delete a project in ScrapingHub