I'm trying to get Crawlera to work with PhantomJS. I have it working with Curl, and I also have PhantomJS working without Crawlera.
I am trying to visit an https site, and am getting the error `bad_proxy_auth` with these settings:
Capybara.register_driver :poltergeist do |app|
Capybara::Poltergeist::Driver.new(app, phantomjs_options: ["--proxy=#{proxy}", "--proxy-auth=#{proxy_auth}", "--ignore-ssl-errors=true"])
end
I get the error: failed to reach server, check DNS and/or server status.
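For reference, the post does not show what `proxy` and `proxy_auth` contain. A minimal sketch of the values Crawlera typically expects (the exact values below are an assumption, not from the post; Crawlera authenticates with the API key as the proxy username and an empty password):

```ruby
# Assumed values for the variables interpolated into phantomjs_options
# above -- the original post does not show them. Crawlera's scheme is
# the API key as proxy username with an empty password.
proxy      = "proxy.crawlera.com:8010"
proxy_auth = "#{ENV.fetch('CRAWLERA_APIKEY', 'APIKEY')}:"  # "<APIKEY>:" form
```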
Sergey Volkov posted almost 6 years ago
try this:
Capybara.register_driver :poltergeist do |app|
Capybara::Poltergeist::Driver.new(app, phantomjs_options: ["--proxy=#{proxy}", "--ignore-ssl-errors=true"])
end
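Dropping `--proxy-auth` only helps if the credentials reach Crawlera some other way. One workaround sometimes suggested (an assumption here, not something confirmed in this thread) is to build the Basic credentials yourself and pass them as a `Proxy-Authorization` header through Poltergeist. Note that page headers only reach the proxy on plain-HTTP requests; an HTTPS CONNECT tunnel bypasses them, which may be exactly why the https site keeps failing:

```ruby
require 'base64'

# Hypothetical workaround: encode "<API key>:" (empty password) as the
# Basic token Crawlera expects and send it as a request header.
api_key     = ENV.fetch('CRAWLERA_APIKEY', 'APIKEY')  # assumed env var name
credentials = Base64.strict_encode64("#{api_key}:")
proxy_auth_header = { 'Proxy-Authorization' => "Basic #{credentials}" }

# In a Capybara/Poltergeist session this would be applied as:
#   page.driver.headers = proxy_auth_header
```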
2 Comments
railscam posted over 7 years ago
I have the exact same problem - my setup is:
Capybara.register_driver :poltergeist do |app|
Capybara::Poltergeist::Driver.new(app, js_errors: false, phantomjs_options: ['--load-images=no', '--ssl-protocol=any', '--proxy=proxy.crawlera.com:8010', "--proxy-auth=#{proxy_auth}", "--ssl-client-certificate-file=#{cert}"])
end
I get the error: failed to reach server, check DNS and/or server status.