
Crawlera proxy doesn't always respond

```curl -x proxy.crawlera.com:8010 -U my_api_key: http://httpbin.org/ip```

successfully returns a page, but not every time.


It can work well for 20 minutes, then for the next 5 minutes it returns:

```curl: (52) Empty reply from server```

It keeps cycling like this the whole time.
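
To document the on/off pattern, I can run a small polling loop that logs every attempt with a timestamp. A minimal sketch using the Python requests library; "my_api_key" is the same placeholder as in the curl command above:

```
import time
import requests

# same credentials as "curl -U my_api_key:" above (empty password)
PROXIES = {"http": "http://my_api_key:@proxy.crawlera.com:8010"}

while True:
    stamp = time.strftime("%Y-%m-%d %H:%M:%S")
    try:
        r = requests.get("http://httpbin.org/ip", proxies=PROXIES, timeout=30)
        print(stamp, "OK", r.status_code, r.text.strip())
    except requests.RequestException as exc:
        # curl's "(52) Empty reply from server" surfaces here as a ConnectionError
        print(stamp, "FAIL", exc.__class__.__name__)
    time.sleep(60)
```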


During the non-working periods I can still establish a connection with

```telnet proxy.crawlera.com 8010```

but I get no reply: the connection closes right after I type the first request line.


```
Trying 64.58.126.143...
Connected to proxy.crawlera.com.
Escape character is '^]'.
GET http://httpbin.org/ip HTTP/1.1
Connection closed by foreign host.
```
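
The same check can be scripted with a raw socket so the exact moment of the disconnect gets logged. A sketch of what the telnet session does by hand:

```
import socket

# connect the same way telnet does and send the request line manually
sock = socket.create_connection(("proxy.crawlera.com", 8010), timeout=10)
sock.sendall(b"GET http://httpbin.org/ip HTTP/1.1\r\n\r\n")
data = sock.recv(4096)
# an empty read means the proxy closed the connection without answering,
# matching "Connection closed by foreign host." above
print(data if data else "connection closed with no reply")
sock.close()
```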


I see this behaviour only on the machine where my Scrapy crawler runs, which has the IP 89.208.114.6.

From machines with other IP addresses the proxy has 100% uptime.

So I assume your proxy is somehow filtering my IP.
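
During a failure window this can be confirmed with a side-by-side check: a direct request to httpbin.org/ip versus the same request through the proxy, from the same machine. A sketch using the requests library, with the same key placeholder as above:

```
import requests

PROXIES = {"http": "http://my_api_key:@proxy.crawlera.com:8010"}

# direct request, bypassing the proxy -- succeeds even during bad periods
direct = requests.get("http://httpbin.org/ip", timeout=30)
print("direct :", direct.status_code, direct.text.strip())

# the same request through Crawlera -- fails only on this machine
try:
    proxied = requests.get("http://httpbin.org/ip", proxies=PROXIES, timeout=30)
    print("proxied:", proxied.status_code, proxied.text.strip())
except requests.RequestException as exc:
    print("proxied: FAIL", exc.__class__.__name__)
```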


Please help me figure out what is happening.

