I'm using the https://github.com/yujiosaka/headless-chrome-crawler#readme crawler with the Crawlera proxy, following the tutorial at https://support.scrapinghub.com/support/solutions/articles/22000220800-using-crawlera-with-puppeteer, but I'm getting the error Error: net::ERR_EMPTY_RESPONSE.
The crawl works fine without the proxy, but the error appears as soon as the proxy is added.
Please advise.
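For context, here is a minimal sketch of the kind of setup the tutorial describes. The endpoint proxy.crawlera.com:8010, the empty-password convention, and the buildCrawleraOptions helper are assumptions for illustration, not confirmed details of the poster's code:

```javascript
// Sketch of a Crawlera + headless-chrome-crawler configuration.
// Assumptions: Crawlera's standard endpoint is proxy.crawlera.com:8010,
// the API key is sent as the proxy username with an empty password, and
// headless-chrome-crawler forwards launch args to Puppeteer/Chrome and
// per-request username/password to page.authenticate().
function buildCrawleraOptions(apiKey) {
  return {
    launch: {
      // Chrome routes all traffic through the proxy via this flag.
      args: ['--proxy-server=proxy.crawlera.com:8010'],
      // Crawlera serves a self-signed certificate for HTTPS, so
      // certificate errors must be ignored.
      ignoreHTTPSErrors: true,
    },
    request: {
      // Proxy credentials, applied per queued request.
      username: apiKey,
      password: '',
    },
  };
}
```

Under those assumptions, the launch part would be spread into HCCrawler.launch() and the request part into each crawler.queue() call.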
2 Votes
3 Comments
Karan Khanna posted over 6 years ago
Thanks, nestor. I had to switch to another library because of this issue; that's why you're seeing mostly successful requests in my account. I'll try bumping the timeout up to 180 seconds.
0 Votes
nestor (Admin) posted over 6 years ago
The error is not Crawlera-related; it's specific to Chrome. In your logs I see mostly successful requests. The only problem I see is that your client closes the connection at 15 seconds, which is not recommended. Try increasing the timeout to 180 seconds.
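A sketch of that change, assuming headless-chrome-crawler accepts a per-request timeout in milliseconds (passed through to Puppeteer's page.goto); withTimeout is a hypothetical helper, not part of the library:

```javascript
// Raise the per-request timeout from the 15 s client cutoff to 180 s.
const TIMEOUT_MS = 180 * 1000;

function withTimeout(queueOptions) {
  // Copy the queue options and set the assumed `timeout` field, which
  // headless-chrome-crawler would hand to Puppeteer's page.goto().
  return Object.assign({}, queueOptions, { timeout: TIMEOUT_MS });
}
```

A queue call would then look like crawler.queue(withTimeout({ url: 'https://example.com' })).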
0 Votes
Karan Khanna posted over 6 years ago
Any updates?
0 Votes