Crawlera is working quite slowly; it's processing a single request in 6 to 8 seconds on average. I'm thinking of making concurrent sessions. I'm allowed 10 sessions on my C10 plan. My question is: how can I make concurrent sessions in Python, and will it increase the speed? An 8-second average is very time consuming, and I need to parse thousands of results.
nestor (Admin) posted over 7 years ago · Best Answer
You could try using grequests, or better yet, Scrapy. It's as simple as setting CONCURRENT_REQUESTS = ## in your project settings.
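To illustrate the idea, here is a minimal sketch using only the standard library's `concurrent.futures` (an alternative to grequests): with 10 workers, a batch of N requests takes roughly N/10 times the per-request latency instead of N times. The `fetch` function below is a stand-in that simulates the slow request with `sleep`; in real use you would swap in an actual `requests.get(..., proxies=...)` call pointed at your Crawlera proxy endpoint.

```python
import time
from concurrent.futures import ThreadPoolExecutor

LATENCY = 0.1  # stand-in for the ~6-8 s Crawlera response time


def fetch(url):
    # Hypothetical worker: simulates one slow round trip.
    # Replace the body with a real proxied requests.get() call.
    time.sleep(LATENCY)
    return "body of %s" % url


urls = ["http://example.com/page/%d" % i for i in range(30)]

start = time.monotonic()
# max_workers=10 matches the C10 plan's 10 concurrent sessions.
with ThreadPoolExecutor(max_workers=10) as pool:
    pages = list(pool.map(fetch, urls))
elapsed = time.monotonic() - start

# 30 requests / 10 workers -> ~3 sequential batches of LATENCY each,
# versus 30 * LATENCY if run one at a time.
print(len(pages), elapsed)
```

So yes, concurrency should help: throughput scales roughly with the number of sessions, even though each individual request still takes 6 to 8 seconds.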