I have set up my Python client to use the Crawlera proxy, based on a sample in the Crawlera documentation, but I get a bad auth (407) error from requests.
I double-checked the API key, host, port and other settings, and everything seems correct!
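Roughly, the setup follows the documented pattern below (the API key is replaced with a placeholder here, and the two prints at the end just surface the status and Crawlera's error header):

import requests

url = "http://httpbin.org/ip"
api_key = "<APIKEY>"  # placeholder, not a real key
# The API key is the proxy username with an empty password, so the trailing ':' matters;
# both entries point at the plain http:// proxy endpoint on port 8010.
proxy_url = "http://{}:@proxy.crawlera.com:8010/".format(api_key)
proxies = {"http": proxy_url, "https": proxy_url}

r = requests.get(url, proxies=proxies, verify=False)
print(r.status_code)
print(r.headers.get("X-Crawlera-Error"))  # shows "bad_proxy_auth" on the failing 407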
8 Comments
vaz posted about 7 years ago Best Answer
Sharing the solution you provided:
Best,
Pablo
1 Votes
anichesine posted about 7 years ago
I'm sorry, could you elaborate on the requests client (requests module?) - what is that and how do you update it?
Same issue here - I'm running this on c9.io, not sure if I can update the requests module.
Thank you for your help.
0 Votes
software industry posted over 6 years ago
Using Crawlera's sample code for a GET request with a proxy.
import requests

url = "http://httpbin.org/ip"
proxy_host = "proxy.crawlera.com"
proxy_port = "8010"
proxy_auth = "<APIKEY>:"  # Make sure to include ':' at the end
proxies = {
    "https": "https://{}@{}:{}/".format(proxy_auth, proxy_host, proxy_port),
    "http": "http://{}@{}:{}/".format(proxy_auth, proxy_host, proxy_port)
}

r = requests.get(url, proxies=proxies, verify=False)
print("Response Headers:")
print(r.headers)
I get a 407 Bad Proxy Auth error. I've triple-checked that the API key is correct. Here's the printed output from above:
Response Headers:
{'Proxy-Connection': 'close', 'Proxy-Authenticate': 'Basic realm="Crawlera"', 'Transfer-Encoding': 'chunked', 'Connection': 'close', 'Date': 'Mon, 26 Mar 2018 11:18:05 GMT', 'X-Crawlera-Error': 'bad_proxy_auth', 'X-Crawlera-Version': '1.32.0-07c786'}
Requests is already updated:
$ pip freeze | grep requests
requests==2.8.1
Any ideas?
0 Votes
thriveni posted over 6 years ago Admin
Are you still facing this issue?
0 Votes
фф ффф posted over 6 years ago
@thriveni, I am facing the same issue right now - I have created several API keys; some of them work (though with 503 responses only) and some fail with a 407 response. I use Crawlera with Scrapy 1.5. What am I doing wrong?
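For reference, the documented scrapy-crawlera configuration is just a few settings.py entries along these lines (the key shown is a placeholder):

# settings.py -- documented scrapy-crawlera setup; the API key value is a placeholder
DOWNLOADER_MIDDLEWARES = {
    'scrapy_crawlera.CrawleraMiddleware': 610
}
CRAWLERA_ENABLED = True
CRAWLERA_APIKEY = '<APIKEY>'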
1 Votes
royyosef posted over 6 years ago
Same issue, no clue.
0 Votes
Devteam Social posted over 6 years ago
Yes, getting the same issue.
Connection: close
Date: Tue, 14 Aug 2018 18:53:07 GMT
Proxy-Authenticate: Basic realm="Crawlera"
Proxy-Connection: close
Transfer-Encoding: chunked
X-Crawlera-Error: bad_proxy_auth
X-Crawlera-Version: 1.33.1-68f021
Kindly update.
0 Votes
Tech Services posted over 6 years ago
We are getting the same issue with bad proxy authentication:
X-Crawlera-Error: bad_proxy_auth
We are using the latest version of the request npm package with Node.js.
How do we set the Proxy-Authorization header with the request npm package?
Can you please help me get my client working?
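If it helps, the Proxy-Authorization value is plain HTTP Basic auth with the API key as the username and an empty password, so it can be built in any language; a quick Python sketch of what the value looks like (placeholder key):

import base64

api_key = "<APIKEY>"  # placeholder, substitute the real key
# Crawlera proxy auth is HTTP Basic: API key as username, empty password.
proxy_authorization = "Basic " + base64.b64encode("{}:".format(api_key).encode()).decode()
print(proxy_authorization)  # "Basic PEFQSUtFWT46" for this placeholder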
0 Votes