Crawlera Scraping with a Python Generator (examples?)

Hi. With Crawlera (Python 3), I'm iterating through a list of ~29k URLs in a for loop: requesting each page, scraping it with Beautiful Soup, then moving on to the next URL. The requests are consecutive.
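
Roughly what I'm doing now looks like this (a simplified sketch; the Crawlera proxy settings, extract_fields, insert_into_sql, and urls are placeholders for my real code):

```python
import requests
from bs4 import BeautifulSoup

# Placeholder Crawlera proxy settings (substitute a real API key)
proxies = {
    "http": "http://<CRAWLERA_API_KEY>:@proxy.crawlera.com:8010/",
    "https": "http://<CRAWLERA_API_KEY>:@proxy.crawlera.com:8010/",
}

for url in urls:  # urls is the list of ~29k URLs
    response = requests.get(url, proxies=proxies, verify=False)
    soup = BeautifulSoup(response.text, "html.parser")
    data = extract_fields(soup)   # my parsing code (placeholder name)
    insert_into_sql(data)         # my DB insert (placeholder name)
```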


All at once:

I think I read that it's possible to send all the URLs at once with a Python generator, so that as each page request is returned, the data is scraped and then inserted into SQL, in non-consecutive order.
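
Something like the following is what I have in mind, if it's feasible (a rough sketch using concurrent.futures threads rather than whatever generator-based approach I half-remember; scrape_one reuses the placeholder pieces from the sequential sketch above):

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

import requests
from bs4 import BeautifulSoup

def scrape_one(url):
    # Same per-URL work as the sequential loop above
    response = requests.get(url, proxies=proxies, verify=False)
    soup = BeautifulSoup(response.text, "html.parser")
    return extract_fields(soup)

# Queue every URL up front; handle each result as soon as it returns
with ThreadPoolExecutor(max_workers=20) as pool:
    futures = {pool.submit(scrape_one, url): url for url in urls}
    for future in as_completed(futures):
        insert_into_sql(future.result())  # results arrive non-consecutively
```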


Question: Does anyone have a simple example of this? Is it possible and/or recommended?


Thanks in advance.
