Crawlera with PhantomJS (Poltergeist)

Posted over 7 years ago by atomant

I'm trying to get Crawlera to work with PhantomJS. I have it working with curl, and I also have PhantomJS working without Crawlera.


I am trying to visit an HTTPS site, and am getting the error `bad_proxy_auth` with these settings:

Capybara.register_driver :poltergeist do |app|
  Capybara::Poltergeist::Driver.new(app, phantomjs_options: ["--proxy=#{proxy}", "--proxy-auth=#{proxy_auth}", "--ignore-ssl-errors=true"])
end
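For reference, `proxy` and `proxy_auth` above are placeholders. With Crawlera, the proxy is the Crawlera endpoint and the credentials are the API key used as the proxy username with an empty password, so the values would look roughly like the sketch below (the endpoint matches the one quoted in the comments; the environment variable name is only illustrative):

# Illustrative only: CRAWLERA_APIKEY is a placeholder environment variable.
# Crawlera expects the API key as the proxy username and an empty password.
proxy      = 'proxy.crawlera.com:8010'
proxy_auth = "#{ENV['CRAWLERA_APIKEY']}:"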
 


0 Votes


2 Comments


railscam posted over 7 years ago

I have the exact same problem. My setup is:


Capybara.register_driver :poltergeist do |app|
  Capybara::Poltergeist::Driver.new(app, js_errors: false, phantomjs_options: ['--load-images=no', '--ssl-protocol=any', '--proxy=proxy.crawlera.com:8010', "--proxy-auth=#{proxy_auth}", "--ssl-client-certificate-file=#{cert}"])
end


I get the error: "failed to reach server, check DNS and/or server status".

1 Vote


Sergey Volkov posted almost 6 years ago

try this:

Capybara.register_driver :poltergeist do |app|
  Capybara::Poltergeist::Driver.new(app, phantomjs_options: ["--proxy=#{proxy}", "--ignore-ssl-errors=true"])
end

plus this:

# APIKEY is your Crawlera API key; the trailing ":" stands for the empty password
Capybara.current_session.driver.headers = { 'Proxy-Authorization' => "Basic #{Base64.strict_encode64("#{APIKEY}:")}" }
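Putting the two pieces together, a minimal end-to-end sketch might look like this (the API key, environment variable name, and target URL are placeholders, not part of the original answer):

require 'base64'
require 'capybara/poltergeist'

APIKEY = ENV.fetch('CRAWLERA_APIKEY')   # placeholder: your Crawlera API key
proxy  = 'proxy.crawlera.com:8010'

# Register Poltergeist with the proxy but without --proxy-auth ...
Capybara.register_driver :poltergeist do |app|
  Capybara::Poltergeist::Driver.new(app, phantomjs_options: ["--proxy=#{proxy}", "--ignore-ssl-errors=true"])
end

# ... and send the credentials as a Proxy-Authorization header instead.
session = Capybara::Session.new(:poltergeist)
session.driver.headers = { 'Proxy-Authorization' => "Basic #{Base64.strict_encode64("#{APIKEY}:")}" }
session.visit('https://example.com')
puts session.status_code

`strict_encode64` is used here because plain `encode64` inserts newlines into the output, which breaks the header value.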

 

0 Votes
