
Web scraping LandWatch into Excel using Python

- I am trying to web scrape www.landwatch.com using Python (Beautiful Soup)

   and have the results written into this Google spreadsheet:

   https://docs.google.com/spreadsheets/d/1XbkOt851PBYcLLQT6fdZex5mTsg5blUhFY_oiDMtUZo/edit?usp=sharing

   I want to scrape all columns, for all 50 states and the 3,145 counties
   listed in the spreadsheet.
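
For reference, here is a rough sketch of what fetching one county page with requests + Beautiful Soup could look like. The URL pattern and the CSS class are guesses and need to be checked against the live page in the browser's dev tools:

```python
import requests
from bs4 import BeautifulSoup

HEADERS = {"User-Agent": "Mozilla/5.0"}  # LandWatch may block the default requests user agent

def county_listing_count(state_slug, county_slug):
    """Return the total listing count shown on a LandWatch county search page.

    The URL pattern and the 'result-count' class below are assumptions;
    adjust them after inspecting the real page.
    """
    url = f"https://www.landwatch.com/{state_slug}-land-for-sale/{county_slug}-county"
    resp = requests.get(url, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    soup = BeautifulSoup(resp.text, "html.parser")

    counter = soup.find("span", class_="result-count")  # hypothetical selector
    if counter is None:
        return None
    digits = "".join(ch for ch in counter.get_text() if ch.isdigit())
    return int(digits) if digits else None

print(county_listing_count("texas", "travis"))
```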

   What to scrape for:

   These are the columns in the Google spreadsheet and what each one should
   contain (see the sketch after this list):

LW total = total available parcels for sale on LandWatch

Price threshold = $50K

Greater than (>) = parcels priced above the threshold

Less than (<) = parcels priced below the threshold

Acre amount, etc.
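
A minimal sketch of how scraped listings could be turned into those per-county counts. The $50K threshold comes from the post; the acreage buckets are placeholders:

```python
# Minimal sketch: turn a list of scraped parcels into the per-county column
# values described above. The 50K price threshold comes from the post; the
# acreage buckets are placeholders -- use whatever the spreadsheet expects.
PRICE_THRESHOLD = 50_000

def county_row(parcels):
    """parcels: list of dicts like {"price": 45000, "acres": 2.5}."""
    return {
        "LW total": len(parcels),
        "greater than 50K": sum(1 for p in parcels if p["price"] > PRICE_THRESHOLD),
        "less than 50K": sum(1 for p in parcels if p["price"] < PRICE_THRESHOLD),
        "10+ acres": sum(1 for p in parcels if p["acres"] >= 10),      # placeholder bucket
        "under 10 acres": sum(1 for p in parcels if p["acres"] < 10),  # placeholder bucket
    }

print(county_row([{"price": 45000, "acres": 2.5}, {"price": 120000, "acres": 40}]))
```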




   Make sure LandWatch has these settings turned on (under Modify Results)
   to ensure that the total number of parcels for sale in each county is
   accurate:

 United States

 (state)

 Land available 


   I am fairly new to Python (Beautiful Soup, etc.) and was wondering
   whether it is possible to scrape this into my Google Sheet. Are there
   any good tutorials or videos on web scraping that might help?
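
For the Google Sheets side, the gspread library can append rows directly. A minimal sketch, assuming a service-account JSON key whose account has edit access to the spreadsheet (the worksheet name and example row are placeholders):

```python
import gspread

# Assumes a Google service account whose email has edit access to the sheet.
gc = gspread.service_account(filename="service_account.json")
sh = gc.open_by_key("1XbkOt851PBYcLLQT6fdZex5mTsg5blUhFY_oiDMtUZo")  # key from the sheet URL above
ws = sh.worksheet("Sheet1")  # placeholder worksheet name

# Example row: county, LW total, less than 50K, greater than 50K (placeholder values)
ws.append_rows([["Travis, TX", 1234, 800, 434]], value_input_option="USER_ENTERED")
```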


   Thanks, Andrew

1 Comment

I would add all the counties into a CSV file and load that into the spider to create all the start_urls that you want to crawl. The output can again be written to CSV and imported into a Google Sheet. What is the part you are struggling with?
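
A rough sketch of that approach with Scrapy, assuming a counties.csv with state and county columns and guessing the LandWatch URL pattern; run with scrapy crawl counties -o results.csv to get the CSV output:

```python
import csv
import scrapy

class CountySpider(scrapy.Spider):
    """Builds start URLs from counties.csv and yields one item per county.

    The CSV columns (state, county) and the LandWatch URL pattern are assumptions.
    """
    name = "counties"

    def start_requests(self):
        with open("counties.csv", newline="") as f:
            for row in csv.DictReader(f):  # expects columns: state, county
                url = (f"https://www.landwatch.com/"
                       f"{row['state']}-land-for-sale/{row['county']}-county")
                yield scrapy.Request(url, callback=self.parse,
                                     cb_kwargs={"state": row["state"],
                                                "county": row["county"]})

    def parse(self, response, state, county):
        # Hypothetical selector for the "N Listings" counter -- adjust after
        # inspecting the real page.
        text = response.css("span.result-count::text").get(default="")
        digits = "".join(ch for ch in text if ch.isdigit())
        yield {"state": state, "county": county,
               "lw_total": int(digits) if digits else None}
```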
