Error deploying

Posted about 7 years ago by Marcos Machado

I'm getting this error when I try to deploy a spider with the shub command line. I can run the spider on my machine with no problem. I installed shub, created the scrapinghub.yml file (with requirements.txt and the project ID), and tried to deploy to Scrapinghub.
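For context, a minimal scrapinghub.yml of the kind described above would look roughly like this (the project ID 12345 is a placeholder, not the actual value from this project):

```yaml
projects:
  default: 12345        # Scrapy Cloud project ID
requirements:
  file: requirements.txt
```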


Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 148, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 243, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 134, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 330, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 61, in from_settings
    return cls(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 25, in __init__
    self._load_all_spiders()
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 47, in _load_all_spiders
    for module in walk_modules(name):
  File "/usr/local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 63, in walk_modules
    mod = import_module(path)
  File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
ImportError: No module named spiders

Traceback (most recent call last):
  File "/usr/local/bin/shub-image-info", line 11, in <module>
    sys.exit(shub_image_info())
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 210, in shub_image_info
    _get_apisettings, commands_module='sh_scrapy.commands')
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 148, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 243, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 134, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 330, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 61, in from_settings
    return cls(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 25, in __init__
    self._load_all_spiders()
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 47, in _load_all_spiders
    for module in walk_modules(name):
  File "/usr/local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 63, in walk_modules
    mod = import_module(path)
  File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
ImportError: No module named spiders

{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}



2 Comments


nestor posted about 7 years ago Admin

Please refer to this article: https://support.scrapinghub.com/support/solutions/articles/22000200400-deploying-python-dependencies-for-your-projects-in-scrapy-cloud. It seems your project is failing to import a module named spiders.


What does your settings.py look like?



Marcos Machado posted almost 7 years ago

My settings file:


BOT_NAME = 'coletores'

SPIDER_MODULES = ['coletores.spiders']
NEWSPIDER_MODULE = 'coletores.spiders'

RETRY_ENABLED = True
RETRY_TIMES = 2
RETRY_HTTP_CODES = [403]
RETRY_PRIORITY_ADJUST = -1
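With these settings, Scrapy needs to be able to import the coletores.spiders package at startup. A typical layout that makes that work looks like the sketch below (the spider filename is illustrative); a missing spiders/__init__.py is a common cause of "ImportError: No module named spiders":

```
project-root/
├── scrapy.cfg
├── scrapinghub.yml
├── requirements.txt
└── coletores/
    ├── __init__.py
    ├── settings.py
    └── spiders/
        ├── __init__.py
        └── my_spider.py
```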


After I created a new project and moved the spiders into the new project folder, the deploy worked. But after I restarted the machine, the same error returned :/
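Since the error comes and goes, a quick local sanity check before each deploy can help narrow it down. Assuming the standard Scrapy and shub tooling, running these from the directory that contains scrapy.cfg and scrapinghub.yml should reproduce the import problem locally if it exists:

```shell
# Should list every spider name without an ImportError;
# if this fails here, the deploy will fail the same way.
scrapy list

# Deploys using the scrapinghub.yml in the current directory,
# so make sure you run it from the project root.
shub deploy
```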


