Hi guys, I deployed my spider project 289804 from Python 3.6 and it's giving me a strange error: ImportError: No module named scrapy_jsonrpc.webservice. Also, why is it showing me unusual locations containing a python27 path? I've attached a file.
Attachments (1): query.PNG (109 KB)
nestor posted almost 7 years ago Admin Best Answer
Probably because you deployed a Python 3.6 spider with a Python 2.7 stack.
Check the available stacks and change the environment following this guide: https://support.scrapinghub.com/support/solutions/articles/22000200402-changing-the-deploy-environment-with-scrapy-cloud-stacks
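For example, a minimal scrapinghub.yml along these lines pins a Python 3 stack for the deploy. This is only a sketch: the stack name scrapy:1.5-py3 is an illustration, so pick one from the list of available stacks in the guide above.

    # scrapinghub.yml - sketch only; choose a stack from the guide's list
    projects:
      default: 289804          # Scrapy Cloud project ID from the question
    stacks:
      default: scrapy:1.5-py3  # a "-py3" stack runs spiders under Python 3

After changing the stack, redeploy with shub deploy so the new environment takes effect.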
3 Comments
nestor posted almost 7 years ago Admin
The error is detailed in the last lines of the traceback: you're trying to import a module, "scrapy_jsonrpc", which is not present in the stack.
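In your traceback the import is triggered by ExtensionManager, so your project settings almost certainly enable scrapy-jsonrpc as an extension. A typical settings.py entry that would cause this import looks roughly like the following (assumed here for illustration, based on the scrapy-jsonrpc README — check your own settings):

    # settings.py - typical scrapy-jsonrpc wiring (assumed; verify against your project)
    EXTENSIONS = {
        'scrapy_jsonrpc.webservice.WebService': 500,
    }
    JSONRPC_ENABLED = True  # the web service stays disabled without this flag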
If your project requires additional Python dependencies, you need to add them to a requirements.txt before deploying. Please follow this guide: https://support.scrapinghub.com/support/solutions/articles/22000200400-deploying-python-dependencies-for-your-projects-in-scrapy-cloud
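Concretely, assuming the missing module comes from the scrapy-jsonrpc package on PyPI, a minimal setup would be a requirements.txt listing it, plus a scrapinghub.yml entry pointing at that file (double-check the exact keys against the guide above):

    # requirements.txt - pin exact versions in a real project
    scrapy-jsonrpc

    # scrapinghub.yml - tell shub to install the requirements on deploy
    projects:
      default: 289804
    requirements:
      file: requirements.txt

shub reads scrapinghub.yml at deploy time and installs everything in the listed file into the stack.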
engnrhamza posted almost 7 years ago
I have made the changes you suggested, but I'm still getting the following error. Help me, I'm just a noob.
[root] Job runtime exception
Traceback (most recent call last):
  File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 148, in _run_usercode
    _run(args, settings)
  File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 103, in _run
    _run_scrapy(args, settings)
  File "/usr/local/lib/python3.6/site-packages/sh_scrapy/crawl.py", line 111, in _run_scrapy
    execute(settings=settings)
  File "/usr/local/lib/python3.6/site-packages/scrapy/cmdline.py", line 150, in execute
    _run_print_help(parser, _run_command, cmd, args, opts)
  File "/usr/local/lib/python3.6/site-packages/scrapy/cmdline.py", line 90, in _run_print_help
    func(*a, **kw)
  File "/usr/local/lib/python3.6/site-packages/scrapy/cmdline.py", line 157, in _run_command
    cmd.run(args, opts)
  File "/usr/local/lib/python3.6/site-packages/scrapy/commands/crawl.py", line 57, in run
    self.crawler_process.crawl(spname, **opts.spargs)
  File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 170, in crawl
    crawler = self.create_crawler(crawler_or_spidercls)
  File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 198, in create_crawler
    return self._create_crawler(crawler_or_spidercls)
  File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 203, in _create_crawler
    return Crawler(spidercls, self.settings)
  File "/usr/local/lib/python3.6/site-packages/scrapy/crawler.py", line 55, in __init__
    self.extensions = ExtensionManager.from_crawler(self)
  File "/usr/local/lib/python3.6/site-packages/scrapy/middleware.py", line 58, in from_crawler
    return cls.from_settings(crawler.settings, crawler)
  File "/usr/local/lib/python3.6/site-packages/scrapy/middleware.py", line 34, in from_settings
    mwcls = load_object(clspath)
  File "/usr/local/lib/python3.6/site-packages/scrapy/utils/misc.py", line 44, in load_object
    mod = import_module(module)
  File "/usr/local/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 941, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 953, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'scrapy_jsonrpc'
nestor posted almost 7 years ago Admin Answer
Probably because you deployed a Python 3.6 spider with a Python 2.7 stack.
Check the available stacks and change the environment following this guide: https://support.scrapinghub.com/support/solutions/articles/22000200402-changing-the-deploy-environment-with-scrapy-cloud-stacks