shub deploy fails and GitHub not showing repository

Posted about 7 years ago by zackrall

Deploy log last 30 lines:

  File "<frozen importlib._bootstrap>", line 948, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'test_com_scraper'
Traceback (most recent call last):
  File "/usr/local/bin/shub-image-info", line 11, in <module>
    sys.exit(shub_image_info())
  File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 210, in shub_image_info
    _get_apisettings, commands_module='sh_scrapy.commands')
  File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 138, in _run_usercode
    settings = populate_settings(apisettings_func(), spider)
  File "/usr/local/lib/python3.6/site-packages/sh_scrapy/settings.py", line 235, in populate_settings
    return _populate_settings_base(apisettings, _load_default_settings, spider)
  File "/usr/local/lib/python3.6/site-packages/sh_scrapy/settings.py", line 164, in _populate_settings_base
    settings = get_project_settings().copy()
  File "/usr/local/lib/python3.6/site-packages/scrapy/utils/project.py", line 68, in get_project_settings
    settings.setmodule(settings_module_path, priority='project')
  File "/usr/local/lib/python3.6/site-packages/scrapy/settings/__init__.py", line 292, in setmodule
    module = import_module(module)
  File "/usr/local/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 978, in _gcd_import
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 936, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 978, in _gcd_import
  File "<frozen importlib._bootstrap>", line 961, in _find_and_load
  File "<frozen importlib._bootstrap>", line 948, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'test_com_scraper'

{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}


{"message": "Internal error", "status": "error"}

Deploy log location: /var/folders/8t/tjlf_p617bx5pdhklmp7npj80000gn/T/shub_deploy_c6b4gdvn.log

Error: Deploy failed: b'{"message": "Internal error", "status": "error"}'


Everything I've seen in other help requests relates to "Internal build error" or to a missing requirements.txt file (which I do have for my dependencies). This error instead seems to point at the setup.py file, specifically this line:

entry_points = {'scrapy': ['settings = test_com_scraper.settings']}

If I change it to com.settings, the "No module" error above becomes No module named 'com' instead.
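For context on why renaming the entry point only changes the error message: per the traceback, shub-image-info reads the scrapy entry point from setup.py and imports the module it names via import_module. A minimal sketch of that import step, using the module name from the log above (this reproduces the failure when no test_com_scraper package is importable, e.g. when it was not included in the uploaded egg):

```python
from importlib import import_module

# The module path from the setup.py entry point:
#   entry_points = {'scrapy': ['settings = test_com_scraper.settings']}
settings_module_path = 'test_com_scraper.settings'

try:
    import_module(settings_module_path)
except ModuleNotFoundError as exc:
    # Matches the deploy log: the top-level package named in the
    # entry point cannot be found on the import path.
    print(exc)  # No module named 'test_com_scraper'
```

If this is the cause, the fix is usually not to edit the entry-point string but to ensure the egg actually contains the package: setup.py's packages argument (e.g. packages=find_packages()) must pick up a test_com_scraper/ directory with an __init__.py, sitting next to scrapy.cfg. This is a sketch of the failure mode, not a confirmed diagnosis of this particular project.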



Also, I've connected my GitHub account and allowed access to private repositories, but those repositories aren't showing up on Scrapinghub. When I create a public repository, it shows up fine.



1 Comment


Yeshwanth Bheempad posted over 1 year ago

Did you resolve the errors?


