Your service is failing to set up my GitHub integration, so I cannot deploy code from GitHub. I realized this was actually because the two email addresses didn't match, which seems a bit silly. But now I've learned a repo can contain only a single Scrapy project? My org has a monorepo, so I would need to redo everything. Also, your link about remedying a missing organization doesn't help with that at all.
When I use the shub tool, I get errors saying it can't find certain packages:
shub deploy 558813
Packing version 9b049cb-main
Deploying to Scrapy Cloud project "558813"
Deploy log last 30 lines:
execute(settings=settings)
File "/usr/local/lib/python3.8/site-packages/scrapy/cmdline.py", line 144, in execute
cmd.crawler_process = CrawlerProcess(settings)
File "/usr/local/lib/python3.8/site-packages/scrapy/crawler.py", line 265, in __init__
super(CrawlerProcess, self).__init__(settings)
File "/usr/local/lib/python3.8/site-packages/scrapy/crawler.py", line 137, in __init__
self.spider_loader = _get_spider_loader(settings)
File "/usr/local/lib/python3.8/site-packages/scrapy/crawler.py", line 345, in _get_spider_loader
return loader_cls.from_settings(settings.frozencopy())
File "/usr/local/lib/python3.8/site-packages/scrapy/spiderloader.py", line 60, in from_settings
return cls(settings)
File "/usr/local/lib/python3.8/site-packages/scrapy/spiderloader.py", line 24, in __init__
self._load_all_spiders()
File "/usr/local/lib/python3.8/site-packages/scrapy/spiderloader.py", line 46, in _load_all_spiders
for module in walk_modules(name):
File "/usr/local/lib/python3.8/site-packages/scrapy/utils/misc.py", line 77, in walk_modules
submod = import_module(fullpath)
File "/usr/local/lib/python3.8/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "<frozen importlib._bootstrap>", line 1014, in _gcd_import
File "<frozen importlib._bootstrap>", line 991, in _find_and_load
File "<frozen importlib._bootstrap>", line 975, in _find_and_load_unlocked
File "<frozen importlib._bootstrap>", line 655, in _load_unlocked
File "<frozen importlib._bootstrap>", line 618, in _load_backward_compatible
File "<frozen zipimport>", line 259, in load_module
File "/app/__main__.egg/keybase/spiders/spider.py", line 6, in <module>
ModuleNotFoundError: No module named 'fake_useragent'
{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}
{"status": "error", "message": "Internal error"}
Deploy log location: /var/folders/rl/c9bdswnx23q1y9pfd8pf6jsr0000gn/T/shub_deploy_mgip91y4.log
Error: Deploy failed: b'{"status": "error", "message": "Internal error"}'
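For reference, the ModuleNotFoundError in the log above points at a missing Python dependency (fake_useragent) in the deploy image rather than an internal service error. Assuming a standard shub setup, declaring the project's dependencies in a requirements file referenced from scrapinghub.yml is the usual way to get them installed during deploy (the project ID below is taken from the deploy command above; the file names are the conventional defaults):

```yaml
# scrapinghub.yml (sketch -- sits at the repo root next to scrapy.cfg)
projects:
  default: 558813
requirements:
  file: requirements.txt
```

with a requirements.txt alongside it listing the missing package, e.g. a line reading `fake-useragent`. After that, re-running `shub deploy 558813` should build the image with the dependency available.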
I want to like this tool, but there are so many errors, I can't even deploy my code!