Crawlera with PhantomJS (Poltergeist)

I'm trying to get Crawlera to work with PhantomJS. I have it working with Curl, and I also have PhantomJS working without Crawlera.
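
For reference, the working non-browser check looks roughly like this in Ruby (a minimal sketch using Net::HTTP rather than Curl; the target URL is just an example, and it assumes the API key is sent as the proxy username with an empty password):

require 'net/http'
require 'openssl'

proxy_host = 'proxy.crawlera.com'
proxy_port = 8010
api_key    = 'APIKEY' # placeholder for your Crawlera API key

uri = URI('https://httpbin.org/ip') # example target
Net::HTTP.start(uri.host, uri.port, proxy_host, proxy_port, api_key, '',
                use_ssl: true, verify_mode: OpenSSL::SSL::VERIFY_NONE) do |http|
  # VERIFY_NONE mirrors PhantomJS's --ignore-ssl-errors=true for this test
  puts http.get(uri.request_uri).body
end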


I am trying to visit an https site, and am getting the error `bad_proxy_auth` with these settings:

Capybara.register_driver :poltergeist do |app|
  Capybara::Poltergeist::Driver.new(app, phantomjs_options: ["--proxy=#{proxy}", "--proxy-auth=#{proxy_auth}", "--ignore-ssl-errors=true"])
end
 



I have the exact same problem - my setup is: 


Capybara.register_driver :poltergeist do |app|
  Capybara::Poltergeist::Driver.new(app, js_errors: false, phantomjs_options: ['--load-images=no', '--ssl-protocol=any', '--proxy=proxy.crawlera.com:8010', "--proxy-auth=#{proxy_auth}", "--ssl-client-certificate-file=#{cert}"])
end


I get the error: failed to reach server, check DNS and/or server status.



try this:

Capybara.register_driver :poltergeist do |app|
  Capybara::Poltergeist::Driver.new(app, phantomjs_options: ["--proxy=#{proxy}", "--ignore-ssl-errors=true"])
end

plus this:

Capybara.current_session.driver.headers = { 'Proxy-Authorization' => "Basic #{Base64.strict_encode64("#{APIKEY}:")}" }
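
Putting the two pieces together, a minimal end-to-end sketch (the API key and target URL are placeholders; it assumes Crawlera expects the API key as the basic-auth username with an empty password, so the encoded string is "APIKEY:"):

require 'base64'
require 'capybara/poltergeist'

proxy   = 'proxy.crawlera.com:8010'
api_key = 'APIKEY' # placeholder for your Crawlera API key

Capybara.register_driver :poltergeist do |app|
  # Only the proxy address goes to PhantomJS; the credentials are sent as a request header below.
  Capybara::Poltergeist::Driver.new(app, phantomjs_options: ["--proxy=#{proxy}", '--ignore-ssl-errors=true'])
end

Capybara.default_driver = :poltergeist
Capybara.run_server     = false # driving an external site, not a Rack app

session = Capybara.current_session
session.driver.headers = { 'Proxy-Authorization' => "Basic #{Base64.strict_encode64("#{api_key}:")}" }
session.visit('https://httpbin.org/ip') # example target
puts session.html # dump the response for a quick check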

 
