Section 2: Covers browser automation and web scraping. We used the Requests, Selenium, and BeautifulSoup4 modules to accomplish these kinds of tasks.
Browser automation and web scraping are two different things. Browser automation is when a script controls a browser by manipulating its user interface, much like a person would. Web scraping is when a script pulls data from a website directly, by requesting a URL or calling an API and extracting what it needs from the response.
- Requests: a module that lets a script fetch pages and call APIs through URLs (sketch below).
- Selenium: a module that drives a real browser, so it works as a hybrid of browser automation and web scraping (sketch below).
- BeautifulSoup: a module focused entirely on web scraping; it parses HTML so a script can extract data from it (sketch below).
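A minimal Requests sketch (the URL here is a placeholder, not one from these notes):

```python
import requests

# Download a page and check for HTTP errors.
response = requests.get("https://example.com")
response.raise_for_status()  # raises an exception on 4xx/5xx status codes

print(response.status_code)
print(response.text[:200])   # first 200 characters of the page body
```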
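A minimal Selenium sketch, assuming Firefox and a matching driver are installed; the URL and the element being clicked are placeholders:

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

browser = webdriver.Firefox()
browser.get("https://example.com")

# Drive the UI like a user would (browser automation)...
link = browser.find_element(By.TAG_NAME, "a")
link.click()

# ...and read data out of the loaded page (web scraping).
print(browser.title)
browser.quit()
```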
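A minimal BeautifulSoup sketch; BeautifulSoup only parses HTML, so Requests does the downloading, and the URL and selector are placeholders:

```python
import requests
from bs4 import BeautifulSoup

response = requests.get("https://example.com")
response.raise_for_status()

# Parse the HTML and print the text of every <h1> element.
soup = BeautifulSoup(response.text, "html.parser")
for heading in soup.select("h1"):
    print(heading.get_text())
```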
Section 3: Covers the usage of the Requests module to call APIs.
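As a sketch of what such a call can look like (the endpoint, query parameter, and header are placeholders, not a real API from these notes):

```python
import requests

# Call a JSON API: query parameters go in params, not pasted into the URL.
response = requests.get(
    "https://api.example.com/users",
    params={"page": 1},                      # becomes ?page=1 on the URL
    headers={"Accept": "application/json"},
)
response.raise_for_status()

data = response.json()  # parse the JSON body into Python objects
print(data)
```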
Section 4: Covers the usage of the pathlib module to work with file paths.
- Pathlib is extremely helpful for working with file paths because it lets us create Path objects and build new paths from them instead of joining strings by hand (sketch below).
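A minimal pathlib sketch; the folder and file names are placeholders:

```python
from pathlib import Path

# The / operator joins path parts into a new Path object.
project = Path.home() / "projects" / "demo"
config = project / "settings.ini"

print(config.name)      # 'settings.ini'
print(config.suffix)    # '.ini'
print(config.parent)    # .../projects/demo
print(config.exists())  # True only if the file actually exists on disk
```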