amrish_beauhurst asked
What is the recommended way of handling scenarios where we run out of credits? This happened to us recently, which meant our scrapers were failing until we upgraded our subscription to C200. We'd like to simply fall back on regular direct connections in such a case.
We use Python with Requests and Scrapy. There doesn't appear to be anything in the docs that suggests how to check for this specific scenario.
Best Answer
nestor said almost 5 years ago
The recommended way is to upgrade your plan; we send an email once you reach 80% of your request quota.
There's nothing out of the box that does what you're asking. You'll need to implement something that detects the HTTP 403 "user suspended" response and starts another crawl process whose settings have CRAWLERA_ENABLED=False.
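For illustration, here is a minimal sketch of that fallback for Scrapy, assuming the scrapy-crawlera middleware is already enabled in the project and that a suspended account returns "user suspended" in the 403 body. The spider name ("example"), file names, and marker-file convention are all hypothetical. Because Scrapy's Twisted reactor can't be restarted within one process, the fallback crawl is launched as a fresh process, exactly as the answer suggests.

# myproject/spiders/example.py -- hypothetical module and spider name
import scrapy
from scrapy.exceptions import CloseSpider

class ExampleSpider(scrapy.Spider):
    name = "example"
    start_urls = ["https://example.com/"]
    # Let 403 responses reach parse() instead of being dropped by
    # Scrapy's default HttpErrorMiddleware.
    handle_httpstatus_list = [403]

    def parse(self, response):
        if response.status == 403 and b"user suspended" in response.body.lower():
            # Leave a marker for the wrapper script, then stop the crawl.
            open("crawlera_suspended.flag", "w").close()
            raise CloseSpider("crawlera_suspended")
        # ... normal parsing ...

# run_crawl.py -- wrapper that re-runs the crawl without Crawlera
# if the spider left the marker file behind
import os
import subprocess

MARKER = "crawlera_suspended.flag"
if os.path.exists(MARKER):
    os.remove(MARKER)

subprocess.run(["scrapy", "crawl", "example"])

if os.path.exists(MARKER):
    # Out of credits: re-run the same spider with direct connections.
    subprocess.run(["scrapy", "crawl", "example", "-s", "CRAWLERA_ENABLED=False"])

For plain Requests, the same check works inline. This assumes the proxy endpoint format from the Crawlera docs, with <APIKEY> as a placeholder for your own key; verify=False is used here in place of installing the Crawlera CA certificate.

import requests

PROXIES = {
    "http": "http://<APIKEY>:@proxy.crawlera.com:8010/",
    "https": "http://<APIKEY>:@proxy.crawlera.com:8010/",
}

def fetch(url):
    resp = requests.get(url, proxies=PROXIES, verify=False)
    if resp.status_code == 403 and "user suspended" in resp.text.lower():
        # Account suspended: retry the same URL as a direct request.
        resp = requests.get(url)
    return resp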