shub deploy project fails - shub-image-info command is not found

Posted over 7 years ago by Simon Schneckenberger

Answered

Hi guys,


When I use the "shub deploy" command in my Mac terminal, I get the following error message:
...

>>> Checking python dependencies

No broken requirements found.

>>> Getting spiders list:

>>> Trying to get spiders from shub-image-info command

shub-image-info command is not found

>>> Trying to get spiders from list-spiders command

WARNING: There're some errors when listing spiders:

{"message": "", "error": "internal_error"}


{"status": "error", "message": "Internal error"}

Deploy log location: /var/folders/36/0m5hmwbx0lq1prnk93fglkwh0000gn/T/shub_deploy_jzY10G.log

Error: Deploy failed: {"status": "error", "message": "Internal error"}


It actually worked two days ago, but since last night I get this message. Any clue how to fix this?

Many thanks in advance!



vaz posted over 7 years ago Best Answer

Hi,


I can see you have successfully deployed to your project 226008.


Let us know if you have further questions.


Best,


Pablo



5 Comments


Saatvik Ramisetty posted about 5 years ago

Facing the same issue



pedro_leibent posted over 5 years ago

I have the same problem and don't know how to solve it:


Packing version 1.0

Deploying to Scrapy Cloud project "400328"

Deploy log last 30 lines:

Removing intermediate container 98e07c23cbaf

 ---> 707e7463b889

Step 12/12 : ENV PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin

 ---> Running in fd3990fb2d78

Removing intermediate container fd3990fb2d78

 ---> 7cf140305499

Successfully built 7cf140305499

Successfully tagged i.scrapinghub.com/kumo_project/400328:3

Step 1/3 : FROM alpine:3.5

 ---> f80194ae2e0c

Step 2/3 : ADD kumo-entrypoint /kumo-entrypoint

 ---> Using cache

 ---> bcc792fa8603

Step 3/3 : RUN chmod +x /kumo-entrypoint

 ---> Using cache

 ---> 204e6cba3933

Successfully built 204e6cba3933

Successfully tagged kumo-entrypoint:latest

Entrypoint container is created successfully

>>> Checking python dependencies

No broken requirements found.

WARNING: There're some errors when doing pip-check:

DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.

DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. Please upgrade your Python as Python 2.7 won't be maintained after that date. A future version of pip will drop support for Python 2.7.

>>> Getting spiders list:

>>> Trying to get spiders from shub-image-info command

WARNING: There're some errors on shub-image-info call:

{"message": "", "error": "internal_error"}


{"status": "error", "message": "Internal error"}

Deploy log location: /tmp/shub_deploy_r7md9t5z.log

Error: Deploy failed: b'{"status": "error", "message": "Internal error"}'
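The repeated DEPRECATION warnings in this log show the image is being built on a Python 2.7 stack, which Scrapy Cloud's tooling no longer maintains. A minimal scrapinghub.yml sketch that pins a Python 3 stack instead (the project id is taken from the log above; the stack name is an assumption — verify it against the stacks currently offered by Scrapy Cloud before using it):

```yaml
# scrapinghub.yml — project id from the deploy log above;
# the stack name below is an assumption, check the current stack list
project: 400328
stacks:
  default: scrapy:2.11
```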




Anupam Ramanna posted about 7 years ago

How did you guys fix this error? Please help, we are facing a similar issue.


No broken requirements found.
>>> Getting spiders list:
>>> Trying to get spiders from shub-image-info command
WARNING: There're some errors on shub-image-info call:
ERROR:root:Job runtime exception
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 148, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 243, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 134, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 330, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 61, in from_settings
    return cls(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 25, in __init__
    self._load_all_spiders()
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 47, in _load_all_spiders
    for module in walk_modules(name):
  File "/usr/local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 71, in walk_modules
    submod = import_module(fullpath)
  File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/app/__main__.egg/all/spiders/youtube_spider.py", line 41, in <module>
  File "/app/__main__.egg/all/spiders/youtube_spider.py", line 44, in HalkSpider
EOFError: EOF when reading a line
Traceback (most recent call last):
  File "/usr/local/bin/shub-image-info", line 11, in <module>
    sys.exit(shub_image_info())
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 210, in shub_image_info
    _get_apisettings, commands_module='sh_scrapy.commands')
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python2.7/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/cmdline.py", line 148, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 243, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 134, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/crawler.py", line 330, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 61, in from_settings
    return cls(settings)
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 25, in __init__
    self._load_all_spiders()
  File "/usr/local/lib/python2.7/site-packages/scrapy/spiderloader.py", line 47, in _load_all_spiders
    for module in walk_modules(name):
  File "/usr/local/lib/python2.7/site-packages/scrapy/utils/misc.py", line 71, in walk_modules
    submod = import_module(fullpath)
  File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/app/__main__.egg/all/spiders/youtube_spider.py", line 41, in <module>
  File "/app/__main__.egg/all/spiders/youtube_spider.py", line 44, in HalkSpider
EOFError: EOF when reading a line
{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}
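The traceback above ends in "EOFError: EOF when reading a line", raised at line 44 of youtube_spider.py inside the HalkSpider class body. That is the signature of raw_input()/input() running at import time: shub-image-info imports every spider module inside a container with no interactive stdin, so the read hits end-of-file. A minimal sketch of the failure mode (the class and attribute names are hypothetical, not the poster's actual code):

```python
import io
import sys

def import_spider_with_stdin(text):
    """Simulate importing a spider module whose class body calls input()."""
    old_stdin = sys.stdin
    sys.stdin = io.StringIO(text)
    try:
        # Hypothetical spider: the class body reads from stdin, so the
        # read executes while the module is being imported.
        class HalkSpider:
            channel = input()
        return HalkSpider.channel
    except EOFError as exc:
        return "EOFError: " + str(exc)
    finally:
        sys.stdin = old_stdin

print(import_spider_with_stdin("UC12345\n"))  # interactive shell: the read succeeds
print(import_spider_with_stdin(""))           # deploy container: empty stdin raises EOFError
```

The fix, under that assumption, is to move any interactive prompt out of module/class scope, e.g. read the value from a spider argument or setting instead.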



zackrall posted about 7 years ago

Hi what did you do to fix this issue?


