Shub Deploy Fails

Posted about 7 years ago by manoj kumar

Answered

Deploying to Scrapy Cloud project "242717"

Deploy log last 30 lines:

    sys.exit(list_spiders())
  File "/usr/local/lib/python2.7/dist-packages/sh_scrapy/crawl.py", line 170, in list_spiders
    _run_usercode(None, ['scrapy', 'list'], _get_apisettings)
  File "/usr/local/lib/python2.7/dist-packages/sh_scrapy/crawl.py", line 127, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python2.7/dist-packages/sh_scrapy/crawl.py", line 87, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python2.7/dist-packages/sh_scrapy/crawl.py", line 95, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/cmdline.py", line 142, in execute
    cmd.crawler_process = CrawlerProcess(settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 209, in __init__
    super(CrawlerProcess, self).__init__(settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 115, in __init__
    self.spider_loader = _get_spider_loader(settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/crawler.py", line 296, in _get_spider_loader
    return loader_cls.from_settings(settings.frozencopy())
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiderloader.py", line 30, in from_settings
    return cls(settings)
  File "/usr/local/lib/python2.7/dist-packages/scrapy/spiderloader.py", line 21, in __init__
    for module in walk_modules(name):
  File "/usr/local/lib/python2.7/dist-packages/scrapy/utils/misc.py", line 71, in walk_modules
    submod = import_module(fullpath)
  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
    __import__(name)
  File "/app/__main__.egg/pdfbot/spiders/scraper_spider_all.py", line 9, in <module>
ImportError: No module named PyPDF2
{"message": "list-spiders exit code: 1", "details": null, "error": "list_spiders_error"}
{"status": "error", "message": "Internal error"}

Deploy log location: c:\users\manoj.k\appdata\local\temp\shub_deploy_etd28r.log

Error: Deploy failed: {"status": "error", "message": "Internal error"}

I am getting the above error while deploying my working PDF link extraction script with shub.

Kindly help me out: how do I deploy the PDF extraction spider and its items to scrapinghub.com?


Regards

Manoj

0 Votes


nestor posted about 7 years ago Admin Best Answer

Python dependencies need to be added to the requirements.txt before deploying using shub. Please see: https://support.scrapinghub.com/support/solutions/articles/22000200400-deploying-python-dependencies-for-your-projects-in-scrapy-cloud
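
For example, a minimal setup could look like the sketch below (the PyPDF2 version pin is only illustrative, and the project ID is the one from your deploy log):

```
# requirements.txt -- list every third-party package your spiders import
PyPDF2==1.26.0   # illustrative pin; use the version you develop against
```

```
# scrapinghub.yml -- tell shub to install the file above when deploying
projects:
  default: 242717
requirements:
  file: requirements.txt
```

Depending on your shub version, the older top-level `requirements_file: requirements.txt` key works as well. With that in place, the deploy builds the image with PyPDF2 installed, and the `scrapy list` step shown in your log should no longer hit the ImportError.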

1 Votes


3 Comments


giuseppe serra posted almost 1 year ago

(1) First, the link you posted, https://support.scrapinghub.com/support/solutions/articles/22000200400-deploying-python-dependencies-for-your-projects-in-scrapy-cloud, is now broken.

(2) The `$ shub deploy 112233` command keeps failing!
I tried pinning all packages to the right versions, but the dependency check still fails afterwards.
```
Packing version 1.0
Deploying to Scrapy Cloud project "730854"
Deploy log last 30 lines:
Step 1/3 : FROM alpine:3.5
 ---> f80194ae2e0c
Step 2/3 : ADD kumo-entrypoint /kumo-entrypoint
 ---> Using cache
 ---> 5f37ed56a741
Step 3/3 : RUN chmod +x /kumo-entrypoint
 ---> Using cache
 ---> 7d9a72060af3
Successfully built 7d9a72060af3
Successfully tagged kumo-entrypoint:latest
Entrypoint container is created successfully
>>> Checking python dependencies
Collecting pip<20.0,>=9.0.3
  Downloading pip-19.3.1-py2.py3-none-any.whl (1.4 MB)
Installing collected packages: pip
Successfully installed pip-19.3.1
botocore 1.16.14 has requirement jmespath<1.0.0,>=0.7.1, but you have jmespath 1.0.1.
botocore 1.16.14 has requirement urllib3<1.26,>=1.20; python_version != "3.4", but you have urllib3 2.1.0.
boto3 1.13.14 has requirement jmespath<1.0.0,>=0.7.1, but you have jmespath 1.0.1.
awscli 1.18.64 has requirement PyYAML<5.4,>=3.10; python_version != "3.4", but you have pyyaml 6.0.1.
Warning: Pip checks failed, please fix the conflicts.
WARNING: There're some errors when doing pip-check:
WARNING: Ignoring invalid distribution -main- (/app/__main__.egg)
WARNING: Ignoring invalid distribution -main- (/app/__main__.egg)
WARNING: Ignoring invalid distribution -main- (/app/__main__.egg)
  WARNING: The scripts pip, pip3 and pip3.8 are installed in '/app/python/bin' which is not on PATH.
  Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.
{"error": "requirements_error", "message": "Dependencies check exit code: 1", "details": "Pip checks failed, please fix the conflicts"}
{"status": "error", "message": "Requirements error"}
Deploy log location: /tmp/shub_deploy_9d624l82.log
Error: Deploy failed: b'{"status": "error", "message": "Requirements error"}'
```

I tried conda, but no joy!
Please let me know what to do here, guys; otherwise I'll have to give up and look for an alternative.
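
For what it's worth, going by the ranges the log itself prints (botocore wants jmespath<1.0.0 and urllib3<1.26, awscli wants PyYAML<5.4), I assume pins like these in requirements.txt would satisfy the pip check, though the exact versions are just my guess and I haven't verified them on Scrapy Cloud:

```
# requirements.txt -- downgrade pins chosen to fit the ranges reported by the deploy log
jmespath==0.10.0    # botocore/boto3 need jmespath<1.0.0,>=0.7.1
urllib3==1.25.11    # botocore needs urllib3<1.26,>=1.20
PyYAML==5.3.1       # awscli needs PyYAML<5.4,>=3.10
```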

 

1 Votes


manoj kumar posted about 7 years ago

Hi Nestor,

Thank you... I got it working with your help.

0 Votes
