
Ruby, Selenium, the crawlera-headless-proxy middle layer, and SSL requests

I am following these instructions:


https://docs.zyte.com/smart-proxy-manager/integrations/selenium.html 


I have set my headless proxy: 


```
docker run -p 3128:3128 scrapinghub/crawlera-headless-proxy -d -a API_KEY
```
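As a quick sanity check (assuming the container is running and the host/port match the `docker run` command above), you can verify from Ruby that something is actually listening on the proxy port before wiring up Selenium. The `proxy_up?` helper name is mine:

```ruby
require "socket"
require "timeout"

# Returns true if a TCP connection to the headless proxy succeeds
# within `timeout` seconds, false otherwise.
def proxy_up?(host = "127.0.0.1", port = 3128, timeout = 2)
  Timeout.timeout(timeout) { TCPSocket.new(host, port).close }
  true
rescue StandardError
  false
end
```

If this returns `false`, the problem is the container or port mapping, not Selenium.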


It works with `http` requests:


```
require "selenium-webdriver"

proxy = Selenium::WebDriver::Proxy.new(type: :manual, http: "127.0.0.1:3128", ssl: "127.0.0.1:3128")
cap = Selenium::WebDriver::Remote::Capabilities.chrome(proxy: proxy)
driver = Selenium::WebDriver.for(:chrome, capabilities: cap)

driver.navigate.to("http://httpbin.org/ip")
puts "content2: #{driver.page_source}"
driver.close
```


But with `https` I have issues:


```
driver.navigate.to("https://httpbin.org/ip")
```


I see this message:


> There is something wrong with the proxy server or the address is incorrect.


(See attached)


What am I doing wrong?




Thanks to the support team I have an answer: 


This worked: 

 

```
docker run -p 3128:3128 scrapinghub/crawlera-headless-proxy -d -a API_KEY
```

 

Then download this CA certificate:

 

https://raw.githubusercontent.com/zytedata/zyte-smartproxy-headless-proxy/master/ca.crt
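The certificate can also be fetched from Ruby — a small sketch, assuming the file should land in the current directory (the `fetch_ca` helper name is mine):

```ruby
require "open-uri"

CA_URL = "https://raw.githubusercontent.com/zytedata/zyte-smartproxy-headless-proxy/master/ca.crt"

# Save the proxy's CA certificate next to the script. It still has to be
# imported into the system/browser trust store afterwards.
def fetch_ca(url = CA_URL, dest = "ca.crt")
  File.write(dest, URI.open(url).read)
  dest
end
```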

 

Install the certificate (see attached screenshot).

 

```
require "selenium-webdriver"

proxy = Selenium::WebDriver::Proxy.new(type: :manual, http: "127.0.0.1:3128", ssl: "127.0.0.1:3128")
cap = Selenium::WebDriver::Remote::Capabilities.chrome(proxy: proxy)
driver = Selenium::WebDriver.for(:chrome, capabilities: cap)

driver.navigate.to("https://httpbin.org/ip")
puts "content2: #{driver.page_source}"
```

 

Result:

```
{
  "origin": "108.62.70.151"
}
```
