Ruby, Selenium, crawlera-headless-proxy middle layer and ssl request
Posted about 4 years ago by d2clon
I am following these instructions:
https://docs.zyte.com/smart-proxy-manager/integrations/selenium.html
I have set my headless proxy:
```
docker run -p 3128:3128 scrapinghub/crawlera-headless-proxy -d -a API_KEY
```
It works with `http` requests:
```
require "selenium-webdriver"

# Route plain HTTP traffic through the local headless proxy.
proxy = Selenium::WebDriver::Proxy.new(type: :manual, http: "127.0.0.1:3128")
cap = Selenium::WebDriver::Remote::Capabilities.chrome(proxy: proxy)
driver = Selenium::WebDriver.for(:chrome, capabilities: cap)
driver.navigate.to("http://httpbin.org/ip")
puts "content2: #{driver.page_source}"
driver.close
```
But with `https` I have issues:
```
driver.navigate.to("https://httpbin.org/ip")
```
I see this message:
> There is something wrong with the proxy server or the address is incorrect.
(See attached)
What am I doing wrong?
Attachments (1)
Screenshot 2....png
41.6 KB
d2clon posted about 4 years ago
Thanks to the support team, I have an answer.
This worked:
```
docker run -p 3128:3128 scrapinghub/crawlera-headless-proxy -d -a API_KEY
```
Then download the proxy's CA certificate:
- https://raw.githubusercontent.com/zytedata/zyte-smartproxy-headless-proxy/master/ca.crt
and install it in the browser/system certificate store (see attached screenshot).
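Installing the CA is needed because the headless proxy intercepts TLS: it re-signs every site's certificate with its own CA, and the browser rejects that chain until the CA is trusted (hence the proxy error page on `https` URLs). A self-contained sketch with Ruby's stdlib `openssl` (throwaway demo certificates, not the real ca.crt) showing the trust check:

```ruby
require "openssl"

# Throwaway CA, standing in for the headless proxy's ca.crt.
ca_key = OpenSSL::PKey::RSA.new(2048)
ca = OpenSSL::X509::Certificate.new
ca.version = 2
ca.serial = 1
ca.subject = OpenSSL::X509::Name.parse("/CN=Demo Proxy CA")
ca.issuer = ca.subject
ca.public_key = ca_key.public_key
ca.not_before = Time.now
ca.not_after = Time.now + 3600
ef = OpenSSL::X509::ExtensionFactory.new
ef.subject_certificate = ca
ef.issuer_certificate = ca
ca.add_extension(ef.create_extension("basicConstraints", "CA:TRUE", true))
ca.sign(ca_key, OpenSSL::Digest.new("SHA256"))

# Leaf certificate for httpbin.org, re-signed by that CA --
# this is the kind of certificate the proxy serves to the browser.
leaf_key = OpenSSL::PKey::RSA.new(2048)
leaf = OpenSSL::X509::Certificate.new
leaf.version = 2
leaf.serial = 2
leaf.subject = OpenSSL::X509::Name.parse("/CN=httpbin.org")
leaf.issuer = ca.subject
leaf.public_key = leaf_key.public_key
leaf.not_before = Time.now
leaf.not_after = Time.now + 3600
leaf.sign(ca_key, OpenSSL::Digest.new("SHA256"))

# Without the CA installed, the chain fails -- the browser error in the post.
empty_store = OpenSSL::X509::Store.new
puts empty_store.verify(leaf)    # => false

# After trusting the CA, the same chain verifies.
trusted_store = OpenSSL::X509::Store.new
trusted_store.add_cert(ca)
puts trusted_store.verify(leaf)  # => true
```

This is exactly what installing ca.crt does for Chrome: it moves the proxy's CA into the trusted store so the re-signed `https` responses verify.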
```
require "selenium-webdriver"

# Route both HTTP and HTTPS traffic through the local headless proxy.
proxy = Selenium::WebDriver::Proxy.new(type: :manual, http: "127.0.0.1:3128", ssl: "127.0.0.1:3128")
cap = Selenium::WebDriver::Remote::Capabilities.chrome(proxy: proxy)
driver = Selenium::WebDriver.for(:chrome, capabilities: cap)
driver.navigate.to("https://httpbin.org/ip")
puts "content2: #{driver.page_source}"
driver.close
```
Result:
```
{
  "origin": "108.62.70.151"
}
```
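That `httpbin.org/ip` body is plain JSON, so the exit IP can be pulled out of `driver.page_source` with Ruby's stdlib `json`. A small sketch (the HTML wrapper and IP value below are illustrative, not real output):

```ruby
require "json"

# Chrome wraps a raw JSON response in HTML/<pre> tags,
# so extract the JSON object before parsing.
page_source = '<html><body><pre>{"origin": "203.0.113.7"}</pre></body></html>'
json = page_source[/\{.*\}/m]
origin = JSON.parse(json)["origin"]
puts origin  # prints 203.0.113.7
```

Checking that `origin` differs from your own IP is a quick way to confirm requests are actually going through the proxy.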
Attachments (1)
Screenshot 2....png
193 KB