Posted almost 5 years ago by Shahzaib Saifullah
I have used which in my settings.py as follows:

# Selenium settings for scrapy-selenium
from shutil import which

SELENIUM_DRIVER_NAME = 'chrome'
SELENIUM_DRIVER_EXECUTABLE_PATH = which('chromedriver')
SELENIUM_DRIVER_ARGUMENTS = ['-headless']
My requirements.txt file:
guppy==0.1.10
selenium>=3.9.0
scrapy-selenium==0.0.7
Dependencies in scrapinghub.yml:

project: 434717
requirements:
  file: requirements.txt
I got this error when deploying:
File "/usr/local/lib/python2.7/site-packages/sh_scrapy/settings.py", line 172, in _populate_settings_base
settings = get_project_settings().copy()
File "/usr/local/lib/python2.7/site-packages/scrapy/utils/project.py", line 68, in get_project_settings
settings.setmodule(settings_module_path, priority='project')
File "/usr/local/lib/python2.7/site-packages/scrapy/settings/__init__.py", line 288, in setmodule
module = import_module(module)
File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
File "/app/__main__.egg/andorra/settings.py", line 94, in <module>
ImportError: cannot import name which
{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}
{"status": "error", "message": "Internal error"}
nestor posted almost 5 years ago Admin Best Answer
To deploy a project that uses Selenium, you need to deploy a custom Docker image; see: https://support.scrapinghub.com/support/solutions/articles/22000240310-deploying-custom-docker-image-with-selenium-on-scrapy-cloud
Note that you would need to subscribe to at least 1 Scrapy Cloud unit to be able to deploy custom Docker images.
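A rough sketch of how scrapinghub.yml could look once custom-image deployment is enabled for shub; the image key, its value, and the need for a Dockerfile (installing Chrome and a matching chromedriver) in the project root are assumptions based on shub's custom-image workflow, so treat the linked article as the authoritative reference:

project: 434717
requirements:
  file: requirements.txt
# Assumed: tells shub to build and deploy a custom Docker image for this
# project, built from a Dockerfile in the project root that installs
# Chrome and a matching chromedriver.
image: true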