This project is a Python script that scrapes rental-listing data from a Zillow clone website and automatically fills a Google Form with the results. It was developed as a capstone project for the "100 Days of Code: The Complete Python Pro Bootcamp" course on Udemy.
To run this project, you'll need:
- Python 3.x
- BeautifulSoup library
- Requests library
- Selenium library
- Chrome WebDriver
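For the scraping half, BeautifulSoup parses the listing page's HTML. The sketch below is a minimal illustration using a hard-coded HTML snippet; the class names (`property-card-link`, `price`) and the markup shape are assumptions for the example, so inspect the real page in your browser's dev tools for the actual selectors:

```python
from bs4 import BeautifulSoup

# Hypothetical listing markup -- the real Zillow-clone page will have its
# own structure and class names; adjust the selectors accordingly.
SAMPLE_HTML = """
<ul>
  <li><a class="property-card-link" href="/homes/1">
        <address>123 Main St, Springfield</address>
        <span class="price">$1,500/mo</span></a></li>
  <li><a class="property-card-link" href="/homes/2">
        <address>456 Oak Ave, Shelbyville</address>
        <span class="price">$2,000+/mo</span></a></li>
</ul>
"""

soup = BeautifulSoup(SAMPLE_HTML, "html.parser")

links = [a["href"] for a in soup.select("a.property-card-link")]
addresses = [addr.get_text(strip=True) for addr in soup.select("address")]
# Strip the "/mo" suffix and any trailing "+" so only the dollar amount remains.
prices = [span.get_text(strip=True).split("/")[0].replace("+", "")
          for span in soup.select("span.price")]

print(links)      # ['/homes/1', '/homes/2']
print(prices)     # ['$1,500', '$2,000']
```

The three lists line up by index, so each listing's link, address, and price can later be zipped together when filling the form.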
- Clone the repository to your local machine:
git clone https://github.com/Dhanush-S-Gowda/webscraping-and-automation-with-beautifulsoup-selenium.git
- Install the required Python packages:
pip install beautifulsoup4 requests selenium
- Download the Chrome WebDriver that matches your installed version of Chrome and place it in your system's PATH.
- Modify the `WEBSITE_LINK` and `FORM_LINK` variables in the script with your desired website and Google Form URLs, respectively.
- Get your `User-Agent` value from https://myhttpheader.com/ and add it to the request headers.
- Run the script:
python scraper.py
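The `requests` library sends a generic default User-Agent that many sites block, which is why the steps above have you copy your browser's value. A minimal sketch of attaching it to every request via a session (the header string below is a placeholder; substitute the value reported for your own browser):

```python
import requests

# Placeholder User-Agent -- replace with the value shown at myhttpheader.com
# for YOUR browser, so requests look like normal browser traffic.
HEADERS = {
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36",
    "Accept-Language": "en-US,en;q=0.9",
}

session = requests.Session()
session.headers.update(HEADERS)  # applied to every request made via `session`

# The actual fetch in the script would then look like:
# response = session.get(WEBSITE_LINK)
# soup = BeautifulSoup(response.text, "html.parser")

print(session.headers["User-Agent"])
```

Using a `Session` keeps the headers (and any cookies) consistent across requests without repeating the `headers=` argument on each call.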