Shahzaib Saifullah
I have used `which` in my settings like this:
# selenium
from shutil import which
SELENIUM_DRIVER_NAME = 'chrome'
SELENIUM_DRIVER_EXECUTABLE_PATH = which('chromedriver')
SELENIUM_DRIVER_ARGUMENTS = ['-headless']
My requirements.txt file:
guppy==0.1.10
selenium>=3.9.0
scrapy-selenium==0.0.7
and the dependencies in my scrapinghub.yml:
project: 434717
requirements:
  file: requirements.txt
When deploying, I got this error:
File "/usr/local/lib/python2.7/site-packages/sh_scrapy/settings.py", line 172, in _populate_settings_base
settings = get_project_settings().copy()
File "/usr/local/lib/python2.7/site-packages/scrapy/utils/project.py", line 68, in get_project_settings
settings.setmodule(settings_module_path, priority='project')
File "/usr/local/lib/python2.7/site-packages/scrapy/settings/__init__.py", line 288, in setmodule
module = import_module(module)
File "/usr/local/lib/python2.7/importlib/__init__.py", line 37, in import_module
__import__(name)
File "/app/__main__.egg/andorra/settings.py", line 94, in <module>
ImportError: cannot import name which
{"message": "shub-image-info exit code: 1", "details": null, "error": "image_info_error"}
{"status": "error", "message": "Internal error"}
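(A side note on the traceback: it shows a Python 2.7 runtime, and `shutil.which` was only added in Python 3.3, which is exactly why the import fails. A version-tolerant fallback, sketched here on the assumption that `distutils` is available in the deploy image, could look like this:)

```python
# Sketch: locate chromedriver on both Python 2 and Python 3.
# shutil.which was added in Python 3.3, so on the Python 2.7
# runtime shown in the traceback the import raises ImportError;
# fall back to distutils.spawn.find_executable, which behaves
# similarly (returns the path, or None if not found).
try:
    from shutil import which  # Python 3.3+
except ImportError:
    from distutils.spawn import find_executable as which  # Python 2

SELENIUM_DRIVER_EXECUTABLE_PATH = which('chromedriver')
```

Either function returns `None` when the binary is not on `PATH`, so the setting degrades gracefully instead of crashing at import time.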
nestor
To deploy a project that uses Selenium, you need to deploy a custom Docker image; see: https://support.scrapinghub.com/support/solutions/articles/22000240310-deploying-custom-docker-image-with-selenium-on-scrapy-cloud
Note that you need a subscription with at least one Scrapy Cloud unit to deploy custom Docker images.
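For reference, a minimal sketch of what such a custom image might look like. The base image tag and the Debian package names below are assumptions, not a tested recipe; check the linked support article for the images actually supported:

```dockerfile
# Dockerfile sketch, assuming a Debian-based Scrapy Cloud stack.
# Base image tag and apt package names are assumptions.
FROM scrapinghub/scrapinghub-stack-scrapy:2.0

# Install a browser and its matching driver for Selenium.
RUN apt-get update && \
    apt-get install -y chromium chromium-driver && \
    rm -rf /var/lib/apt/lists/*

# Install the project's Python dependencies.
COPY requirements.txt /app/requirements.txt
RUN pip install -r /app/requirements.txt
```

With a Dockerfile in place, the shub documentation describes setting `image: true` in scrapinghub.yml so that `shub deploy` builds and pushes the custom image instead of a plain Python egg.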