I have set up my Python client to use the Crawlera proxy, based on a sample from the Crawlera documentation, but I get a bad auth (407) error from requests.
I double-checked the API key, host, port and the other settings, and everything seems correct!
amrollah
Can you please help me to get my client working?
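(A minimal sketch for surfacing what Crawlera actually reports on failure, via the X-Crawlera-Error response header; <APIKEY> is a placeholder:)

import requests

# Placeholder credentials; Crawlera expects the API key as the proxy
# username, with an empty password ("<APIKEY>:").
proxy = "http://<APIKEY>:@proxy.crawlera.com:8010/"
proxies = {"http": proxy, "https": proxy}

r = requests.get("http://httpbin.org/ip", proxies=proxies)
print(r.status_code)                      # 407 means the proxy rejected the credentials
print(r.headers.get("X-Crawlera-Error"))  # e.g. "bad_proxy_auth"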
vaz
Sharing the solution you provided yourself: updating the requests client resolved the bad auth error.
Best,
Pablo
anichesine
I'm sorry, could you elaborate on the requests client (requests module?) - what is that and how do you update it?
Same issue here - I'm running this on c9.io, not sure if I can update the requests module.
Thank you for your help.
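(To answer the version question: a quick, environment-agnostic way to check which requests version is installed, as a minimal sketch:)

import requests

# Prints the installed version, e.g. "2.8.1" on an old install.
print(requests.__version__)

# To upgrade from a shell (a c9.io terminal works too):
#   pip install --upgrade requests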
software industry
I'm using Crawlera's sample code for a GET request through the proxy:
import requests

url = "http://httpbin.org/ip"
proxy_host = "proxy.crawlera.com"
proxy_port = "8010"
proxy_auth = "<APIKEY>:"  # Make sure to include ':' at the end
proxies = {
    "https": "https://{}@{}:{}/".format(proxy_auth, proxy_host, proxy_port),
    "http": "http://{}@{}:{}/".format(proxy_auth, proxy_host, proxy_port)
}
r = requests.get(url, proxies=proxies, verify=False)
I get a 407 Bad Proxy Auth error. I've triple-checked that the API key is correct. Here's the printed output from above:
Response Headers:
{'Proxy-Connection': 'close', 'Proxy-Authenticate': 'Basic realm="Crawlera"', 'Transfer-Encoding': 'chunked', 'Connection': 'close', 'Date': 'Mon, 26 Mar 2018 11:18:05 GMT', 'X-Crawlera-Error': 'bad_proxy_auth', 'X-Crawlera-Version': '1.32.0-07c786'}
Requests is already updated:
$ pip freeze | grep requests
requests==2.8.1
Any ideas?
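(For what it's worth, requests 2.8.1 dates from late 2015, so "already updated" may only mean the newest version that environment's pip resolves; proxy-auth handling in requests/urllib3 improved in later releases. As an alternative to embedding credentials in the proxy URL, requests also ships HTTPProxyAuth; a minimal sketch, noting it only helps for plain-HTTP targets:)

import requests
from requests.auth import HTTPProxyAuth

proxy = "http://proxy.crawlera.com:8010/"
proxies = {"http": proxy, "https": proxy}

# Crawlera uses the API key as the username and an empty password.
auth = HTTPProxyAuth("<APIKEY>", "")

# HTTPProxyAuth sets the Proxy-Authorization header on the request.
# HTTPS targets are tunnelled via CONNECT, where requests takes the
# proxy credentials from the proxy URL instead, so this approach
# applies to plain-HTTP URLs only.
r = requests.get("http://httpbin.org/ip", proxies=proxies, auth=auth)
print(r.status_code, r.headers.get("X-Crawlera-Error"))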
thriveni
Are you still facing this issue?
фф ффф
@thriveni, I am facing the same issue right now. I have created several API keys; some of them work (though they return 503 responses only) and some fail with a 407 response. I use Crawlera with Scrapy 1.5. What am I doing wrong?
royyosef
Same issue, no clue.
Devteam Social
Yes, getting the same issue:
Connection: close
Date: Tue, 14 Aug 2018 18:53:07 GMT
Proxy-Authenticate: Basic realm="Crawlera"
Proxy-Connection: close
Transfer-Encoding: chunked
X-Crawlera-Error: bad_proxy_auth
X-Crawlera-Version: 1.33.1-68f021
Kindly update.
Tech Services
We are getting the same issue with bad proxy authentication:
X-Crawlera-Error: bad_proxy_auth
We are using the latest version of the request npm package with Node.js.
How do we set the Proxy-Authorization header in the request npm package?
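(The Proxy-Authorization value Crawlera expects is standard Basic auth over "<APIKEY>:", i.e. the API key as username with an empty password, so the same header value can be reused in any HTTP client, including Node's request package. A sketch of building it, in Python for consistency with the rest of the thread:)

import base64
import requests

# Basic auth over "<APIKEY>:"; the API key is the username, the password is empty.
token = base64.b64encode(b"<APIKEY>:").decode("ascii")
headers = {"Proxy-Authorization": "Basic " + token}

# For a plain-HTTP target the header is sent to the proxy directly.
proxies = {"http": "http://proxy.crawlera.com:8010/"}
r = requests.get("http://httpbin.org/ip", proxies=proxies, headers=headers)
print(r.status_code)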