Module Pymssql error

Posted over 7 years ago by Francisco Pires

Answered

Hi,

I have deployed my project, but when it runs it gives me this error:


 

File "/app/python/lib/python2.7/site-packages/scrapy/commands/crawl.py", line 57, in run
	    self.crawler_process.crawl(spname, **opts.spargs)
	  File "/app/python/lib/python2.7/site-packages/scrapy/crawler.py", line 168, in crawl
	    return self._crawl(crawler, *args, **kwargs)
	  File "/app/python/lib/python2.7/site-packages/scrapy/crawler.py", line 172, in _crawl
	    d = crawler.crawl(*args, **kwargs)
	  File "/app/python/lib/python2.7/site-packages/twisted/internet/defer.py", line 1532, in unwindGenerator
	    return _inlineCallbacks(None, gen, Deferred())
	--- <exception caught here> ---
	  File "/app/python/lib/python2.7/site-packages/twisted/internet/defer.py", line 1386, in _inlineCallbacks
	    result = g.send(result)
	  File "/app/python/lib/python2.7/site-packages/scrapy/crawler.py", line 95, in crawl
	    six.reraise(*exc_info)
	  File "/app/python/lib/python2.7/site-packages/scrapy/crawler.py", line 77, in crawl
	    self.engine = self._create_engine()
	  File "/app/python/lib/python2.7/site-packages/scrapy/crawler.py", line 102, in _create_engine
	    return ExecutionEngine(self, lambda _: self.stop())
	  File "/app/python/lib/python2.7/site-packages/scrapy/core/engine.py", line 70, in __init__
	    self.scraper = Scraper(crawler)
	  File "/app/python/lib/python2.7/site-packages/scrapy/core/scraper.py", line 71, in __init__
	    self.itemproc = itemproc_cls.from_crawler(crawler)
	  File "/app/python/lib/python2.7/site-packages/scrapy/middleware.py", line 58, in from_crawler
	    return cls.from_settings(crawler.settings, crawler)
	  File "/app/python/lib/python2.7/site-packages/scrapy/middleware.py", line 34, in from_settings
	    mwcls = load_object(clspath)
	  File "/app/python/lib/python2.7/site-packages/scrapy/utils/misc.py", line 44, in load_object
	    mod = import_module(module)
	  File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
	    __import__(name)
	  File "/app/__main__.egg/Maiscarrinho/pipelines.py", line 9, in <module>
	    
	exceptions.ImportError: No module named pymssql
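
For context, the import that fails at line 9 of pipelines.py is just the pymssql import at the top of my pipeline, roughly like this (trimmed down; connection details are placeholders, not the real ones):

# pipelines.py (trimmed sketch; connection details are placeholders)
import pymssql  # the import that raises ImportError on Scrapy Cloud


class MssqlPipeline(object):
    """Writes scraped items to SQL Server through pymssql."""

    def open_spider(self, spider):
        self.conn = pymssql.connect(server="myserver", user="user",
                                    password="secret", database="scrapydb")

    def close_spider(self, spider):
        self.conn.close()

    def process_item(self, item, spider):
        # the actual INSERT logic is omitted here
        return item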

 

I have pymssql 2.1.3 installed locally and my spider works fine when I run the script manually from the command line.


I tried to put pymssql==2.1.3 in the requirements.txt, but it throws this when I try to deploy:


 

Packing version 1.0
Deploying to Scrapy Cloud project "218288"
Deploy log last 30 lines:
Installing collected packages: pymssql
  Running setup.py install for pymssql
    Complete output from command /usr/bin/python -c "import setuptools, tokenize;__file__='/tmp/pip-build-o5IHzj/pymssql/setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-HC2agm-record/install-record.txt --single-version-externally-managed --compile --user --prefix=:
    setup.py: platform.system() => 'Linux'
    setup.py: platform.architecture() => ('64bit', 'ELF')
    setup.py: platform.linux_distribution() => ('Ubuntu', '12.04', 'precise')
    setup.py: platform.libc_ver() => ('glibc', '2.4')
    setup.py: Not using bundled FreeTDS
    setup.py: include_dirs = ['/usr/local/include']
    setup.py: library_dirs = ['/usr/local/lib']
    running install
    running build
    running build_ext
    building '_mssql' extension
    creating build
    creating build/temp.linux-x86_64-2.7
    gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -I/usr/local/include -I/usr/include/python2.7 -c _mssql.c -o build/temp.linux-x86_64-2.7/_mssql.o -DMSDBLIB
    _mssql.c:266:22: fatal error: sqlfront.h: No such file or directory
    compilation terminated.
    error: command 'gcc' failed with exit status 1
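
As far as I can tell, sqlfront.h is a header that ships with the FreeTDS development files, so pip cannot compile pymssql from source while those headers are missing from the build image. For reference, the dependency is declared roughly like this (a sketch; the exact file contents may differ, only the project id comes from the deploy log above):

# scrapinghub.yml (sketch)
projects:
  default: 218288
requirements_file: requirements.txt  # requirements.txt contains the single line: pymssql==2.1.3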

 

Can someone help me?
Thanks

0 Votes


vaz posted over 7 years ago Best Answer

Hi, did you try using this approach: https://helpdesk.scrapinghub.com/support/solutions/articles/22000200400-deploying-python-dependencies-for-your-projects-in-scrapy-cloud

If so, and it didn't work, that probably means you need to deploy your own Docker image. Please check this article for further information:
https://shub.readthedocs.io/en/stable/deploy-custom-image.html#deploy-custom-image
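
If you do go the custom image route, a minimal Dockerfile for this case could look something like the sketch below. The base image tag, settings module and file layout are assumptions to adapt to your project; the key point is that installing freetds-dev provides the sqlfront.h header your build could not find:

# Dockerfile (sketch; base image tag, settings module and layout are assumptions)
FROM scrapinghub/scrapinghub-stack-scrapy:1.3

# FreeTDS development files: these provide sqlfront.h, which the pymssql build needs.
RUN apt-get update -qq && \
    apt-get install -qy freetds-dev && \
    rm -rf /var/lib/apt/lists/*

ENV SCRAPY_SETTINGS_MODULE Maiscarrinho.settings

WORKDIR /app
COPY ./requirements.txt /app/requirements.txt
RUN pip install --no-cache-dir -r requirements.txt
COPY . /app

With that in place, pointing scrapinghub.yml at the image (for example with image: true) and deploying with shub should use it; the article above walks through the exact steps.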


Best regards,


Pablo

0 Votes

