ArgumentParser has no attribute add_option while deploying spider

Posted over 2 years ago by hienpham15

Step 3/3 : RUN chmod +x /kumo-entrypoint

---> Using cache

---> deb5c064fea7

Successfully built deb5c064fea7

Successfully tagged kumo-entrypoint:latest

Entrypoint container is created successfully

>>> Checking python dependencies

Collecting pip<20.0,>=9.0.3

Downloading pip-19.3.1-py2.py3-none-any.whl (1.4 MB)

Installing collected packages: pip

Successfully installed pip-19.3.1

No broken requirements found.

WARNING: There're some errors when doing pip-check:

WARNING: Ignoring invalid distribution -main- (/app/__main__.egg)

WARNING: Ignoring invalid distribution -main- (/app/__main__.egg)

WARNING: Ignoring invalid distribution -main- (/app/__main__.egg)

WARNING: The scripts pip, pip3 and pip3.8 are installed in '/app/python/bin' which is not on PATH.

Consider adding this directory to PATH or, if you prefer to suppress this warning, use --no-warn-script-location.

>>> Getting spiders list:

>>> Trying to get spiders from shub-image-info command

WARNING: There're some errors on shub-image-info call:

Addon import error scrapy_pagestorage.

PageStorageMiddleware: cannot import name 'DictItem' from 'scrapy.item' (/app/python/lib/python3.8/site-packages/scrapy/item.py)

ERROR:root:Job runtime exception

Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/cmdline.py", line 140, in execute
    cmd.add_options(parser)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/commands/shub_image_info.py", line 23, in add_options
    parser.add_option("--debug", action="store_true",
AttributeError: 'ArgumentParser' object has no attribute 'add_option'

Traceback (most recent call last):
  File "/usr/local/bin/shub-image-info", line 8, in <module>
    sys.exit(shub_image_info())
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 209, in shub_image_info
    _run_usercode(None, ['scrapy', 'shub_image_info'] + sys.argv[1:],
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/app/python/lib/python3.8/site-packages/scrapy/cmdline.py", line 140, in execute
    cmd.add_options(parser)
  File "/usr/local/lib/python3.8/site-packages/sh_scrapy/commands/shub_image_info.py", line 23, in add_options
    parser.add_option("--debug", action="store_true",
AttributeError: 'ArgumentParser' object has no attribute 'add_option'

{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}


I'm getting this error while deploying a spider. Please help.
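From the traceback, the failure looks like an optparse/argparse mismatch: newer Scrapy releases build the command-line parser with argparse, whose method is add_argument(), while the stack's sh_scrapy shub_image_info command still calls the optparse-era add_option(). A minimal, standard-library-only sketch that reproduces the same AttributeError (the real parser is built inside scrapy/cmdline.py):

import argparse

# argparse.ArgumentParser exposes add_argument(), not the
# optparse.OptionParser method add_option() that the old
# sh_scrapy command calls.
parser = argparse.ArgumentParser()
try:
    parser.add_option("--debug", action="store_true")
except AttributeError as exc:
    print(exc)  # 'ArgumentParser' object has no attribute 'add_option'

# The argparse equivalent of the failing call:
parser.add_argument("--debug", action="store_true")

If that reading is right, the Scrapy version installed into the image is newer than what the stack's sh_scrapy expects.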



5 Comments


Rami Maalouf posted over 1 year ago

After centuries of digging, I found the reason it breaks: I had Scrapy in my requirements.txt. All I had to do was remove that line. But I believe you should also remove all the packages that are already built into the Scrapy Cloud stack; you can find that list for scrapy-v2.8 here. I noticed this because I read the following in the docs:
"Note that this requirements file is an extension of the Scrapy Cloud stack, and therefore should not contain packages that are already part of the stack, such as scrapy." (source: https://shub.readthedocs.io/en/latest/deploying.html)


Rami Maalouf posted over 1 year ago

I'm having the same issue, and I'm not working with any containers or images.


Ricard Falcó posted about 2 years ago

I'm having similar problems, though I'm not sure whether it happens to you in the same case.

In my situation, I get this error when trying to deploy a custom image.

I don't know whether it's related to the stack assigned in the Dockerfile of the custom image.

In my case I use the following one:

FROM scrapinghub/scrapinghub-stack-scrapy:2.4-latest

Please let me know whether this happens to you while working with custom images or with a simple deploy.

(If it happens with just a normal deploy, I would try updating the shub libraries.)
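For reference, a minimal sketch of a custom-image Dockerfile along those lines (the stack tag and paths are illustrative; the key points are matching the stack tag to the Scrapy version the project targets, and not reinstalling packages the stack already ships):

FROM scrapinghub/scrapinghub-stack-scrapy:2.4-latest

# Install only project-specific dependencies; the stack image already
# provides scrapy and sh_scrapy, so they should not appear here.
COPY requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir -r /app/requirements.txt

# Copy the spider project into the image.
COPY . /app
WORKDIR /app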


chaim rydlewicz posted about 2 years ago

I am having the same problem.


Luiz Batista posted about 2 years ago

Hi, did you solve this problem?

