Hi Crawlera team,
I am using Crawlera C200 to scrape amazon.com. I find that the number of concurrent requests is lower than I expect, and it takes too long to scrape the data on amazon.com.
My question is: is there any way to speed up scraping amazon.com with Crawlera as a proxy server?
Thanks,
CK
Best Answer
vaz said about 7 years ago
Hi CK, many customers have upgraded to Enterprise accounts to take advantage of our dedicated pool of proxies and get ban assistance as well as crawl-tuning features. This is particularly useful for fetching data from big sites like Amazon.
If you are interested, you can tell us more about your needs and expected results through:
https://scrapinghub.com/quote
Our team can evaluate the best alternatives and provide a free quote to help you with your project.
Best regards,
Pablo
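
For anyone hitting the same ceiling: the concurrency limit is enforced on Crawlera's side (the C200 plan's cap is 200 concurrent requests), so the gain usually comes from making sure the client is not throttling itself below that limit. Below is a minimal Scrapy settings sketch, assuming the scrapy-crawlera middleware; the API key is a placeholder and the exact values depend on your plan.

# settings.py -- client-side settings that usually matter for throughput
# when Scrapy goes through Crawlera (a sketch, not a drop-in config).
DOWNLOADER_MIDDLEWARES = {
    'scrapy_crawlera.CrawleraMiddleware': 610,
}
CRAWLERA_ENABLED = True
CRAWLERA_APIKEY = '<your Crawlera API key>'  # placeholder

# Let Scrapy issue as many parallel requests as the plan allows;
# Crawlera still enforces its own concurrency limit server-side.
CONCURRENT_REQUESTS = 200
CONCURRENT_REQUESTS_PER_DOMAIN = 200

# Client-side delays and autothrottling keep concurrency below the plan
# limit, so they are normally disabled when Crawlera handles bans.
AUTOTHROTTLE_ENABLED = False
DOWNLOAD_DELAY = 0

# Responses can be slow while Crawlera retries through different IPs,
# so a generous timeout avoids dropping requests that would succeed.
DOWNLOAD_TIMEOUT = 600

Pushing CONCURRENT_REQUESTS far beyond the plan limit does not help; the excess requests typically come back as 429 Too Many Requests.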