
Selenium WebDriver, ChromeDriver and Chrome in a Docker image failing: Chrome crashes

I get the following error after configuring a custom Docker image for my spider:


selenium.common.exceptions.WebDriverException: Message: unknown error: Chrome failed to start: crashed.
	  (unknown error: DevToolsActivePort file doesn't exist) 

(The process started from chrome location /usr/bin/google-chrome is no longer running, so ChromeDriver is assuming that Chrome has crashed.) 


I need help; I have been struggling with this for the past month on a paid Zyte Scrapy Cloud account. I have attached my spider, Dockerfile and requirements file.


Thank you. 

Attachments: yml (156 Bytes), txt (135 Bytes), (2.64 KB), py (27 KB)

Hello,


Please try removing

self.options.headless = True

since it is already provided through the argument self.options.add_argument('--headless').


Also in this,

self.driver = webdriver.Chrome(options=self.options)

please add the path to the chromedriver executable, as in:

self.driver = webdriver.Chrome(options=self.options, executable_path='/usr/bin/chromedriver')


Please let us know if the changes helped.
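Taken together, the suggested change looks roughly like this — a sketch, assuming chromedriver is installed at /usr/bin/chromedriver inside the image (the --disable-dev-shm-usage flag is an extra assumption that often helps with the "DevToolsActivePort file doesn't exist" error in containers, since Docker's default /dev/shm is small):

```python
# Flags that commonly let Chrome start headless inside a container.
CHROME_FLAGS = [
    "--headless",               # replaces self.options.headless = True
    "--no-sandbox",
    "--disable-gpu",
    "--disable-dev-shm-usage",  # assumption: works around Docker's small /dev/shm
]

def make_driver():
    # Imported lazily so the flag list above can be inspected without Selenium installed.
    from selenium import webdriver

    options = webdriver.ChromeOptions()
    for flag in CHROME_FLAGS:
        options.add_argument(flag)
    # Selenium 3 signature, matching the snippet above.
    return webdriver.Chrome(options=options, executable_path="/usr/bin/chromedriver")
```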

@thriveni I have made those changes and am still getting the same error. I am developing on a Windows 10 PC.

Hi,


my situation is the same... any update?

I'm facing the same issue: "The process started from chrome location /usr/bin/google-chrome is no longer running, so ChromeDriver is assuming that Chrome has crashed"

Please ensure that the executable path is also added, along with the arguments below.


options = webdriver.ChromeOptions()
options.add_argument("--disable-extensions")
options.add_argument("--headless")
options.add_argument("--disable-gpu")
options.add_argument("--no-sandbox")
self.driver = webdriver.Chrome(options=options, executable_path='/usr/bin/chromedriver')


Please refer to this article, which gives an example Dockerfile and sample working code.
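For completeness, a sketch of how those pieces might sit inside a spider class — the flag set is the one from the snippet above, the chromedriver path is the one from the article, and the closed() hook (which Scrapy calls when the spider finishes) is my addition so Chrome doesn't keep running in the container; the mixin name is hypothetical:

```python
DRIVER_FLAGS = ("--disable-extensions", "--headless", "--disable-gpu", "--no-sandbox")

class SeleniumDriverMixin:
    """Hypothetical mixin: attach a headless Chrome driver to a Scrapy spider."""

    def open_driver(self):
        # Imported lazily inside the method so the class is importable without Selenium.
        from selenium import webdriver

        options = webdriver.ChromeOptions()
        for flag in DRIVER_FLAGS:
            options.add_argument(flag)
        self.driver = webdriver.Chrome(
            options=options, executable_path="/usr/bin/chromedriver"
        )

    def closed(self, reason):
        # Scrapy invokes closed() at the end of the crawl; quit() also kills Chrome.
        self.driver.quit()
```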

@thriveni,


With reference to https://support.zyte.com/support/solutions/articles/22000240310, I have set up all the components according to that article. Each time I run the spider, I encounter the following error:


Traceback (most recent call last):
  File "/usr/local/lib/python3.8/site-packages/twisted/internet/defer.py", line 1418, in _inlineCallbacks
    result = g.send(result)
  File "/usr/local/lib/python3.8/site-packages/scrapy/crawler.py", line 86, in crawl
    self.spider = self._create_spider(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/scrapy/crawler.py", line 98, in _create_spider
    return self.spidercls.from_crawler(self, *args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/scrapy/spiders/__init__.py", line 49, in from_crawler
    spider = cls(*args, **kwargs)
  File "/usr/local/lib/python3.8/site-packages/project-1.0-py3.8.egg/YahooFinanceSel/spiders/yahooFinance.py", line 19, in __init__
  File "/usr/local/lib/python3.8/site-packages/selenium/webdriver/chrome/webdriver.py", line 76, in __init__
    RemoteWebDriver.__init__(
  File "/usr/local/lib/python3.8/site-packages/selenium/webdriver/remote/webdriver.py", line 157, in __init__
    self.start_session(capabilities, browser_profile)
  File "/usr/local/lib/python3.8/site-packages/selenium/webdriver/remote/webdriver.py", line 252, in start_session
    response = self.execute(Command.NEW_SESSION, parameters)
  File "/usr/local/lib/python3.8/site-packages/selenium/webdriver/remote/webdriver.py", line 321, in execute
    self.error_handler.check_response(response)
  File "/usr/local/lib/python3.8/site-packages/selenium/webdriver/remote/errorhandler.py", line 242, in check_response
    raise exception_class(message, screen, stacktrace)
selenium.common.exceptions.WebDriverException: Message: unknown error: Chrome failed to start: crashed.
  (unknown error: DevToolsActivePort file doesn't exist)
  (The process started from chrome location /usr/bin/google-chrome is no longer running, so ChromeDriver is assuming that Chrome has crashed.)