
Crawlera proxy doesn't always respond

```curl -x -U my_api_key:```

successfully returns a page, but not every time.

It can work fine for 20 minutes, then for the next 5 minutes it returns:

```curl: (52) Empty reply from server```

It cycles this way all the time.
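As a workaround while the intermittent failures are being investigated, the failing requests can be wrapped in a simple retry loop. This is only a sketch: `PROXY_HOST` and `TARGET_URL` in the usage comment are placeholders, since the real proxy host and target URL are not shown above.

```shell
# retry MAX CMD [ARGS...]: run CMD up to MAX times, backing off
# between attempts; succeed as soon as one attempt succeeds.
retry() {
    max="$1"; shift
    attempt=1
    while [ "$attempt" -le "$max" ]; do
        if "$@"; then
            return 0
        fi
        echo "attempt $attempt failed" >&2
        # linear backoff; skip the sleep after the final attempt
        [ "$attempt" -lt "$max" ] && sleep "$attempt"
        attempt=$((attempt + 1))
    done
    return 1
}

# Example (PROXY_HOST and TARGET_URL are placeholders):
# retry 5 curl -sf -x PROXY_HOST:8010 -U my_api_key: TARGET_URL -o /dev/null
```

Since curl exits with code 52 on "Empty reply from server", a plain exit-status check like this treats those empty replies as transient failures and retries them.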

During the non-working periods I can establish a connection with

```telnet 8010```

but I get no reply: the connection closes right after I type the first line.



```
Connected to
Escape character is '^]'.

Connection closed by foreign host.
```
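The manual telnet check can be scripted so the outage windows are easier to log. A minimal sketch, assuming bash (for its `/dev/tcp` redirection) and coreutils `timeout`; `PROXY_HOST` in the usage comment is a placeholder, since the real host is omitted above.

```shell
#!/bin/bash
# port_open HOST PORT: succeed if a plain TCP connection to HOST:PORT
# can be established (roughly what the interactive telnet test checks).
port_open() {
    # /dev/tcp is a bash feature; timeout bounds the connect attempt.
    timeout 3 bash -c "exec 3<>/dev/tcp/$1/$2" 2>/dev/null
}

# Example (PROXY_HOST is a placeholder):
# port_open PROXY_HOST 8010 && echo "TCP connect ok"
```

Note that, as described above, the TCP connect can succeed even while the proxy sends no reply, so this only detects full connection failures, not the empty-reply state.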


I see this behaviour only on the machine where my Scrapy crawler runs, which has IP:

From machines with other IP addresses the proxy has 100% uptime.

So I assume your proxy is filtering my IP somehow.

Please help me figure out what is happening.
