Automatically log into LinkedIn and scrape LinkedIn Premium Insights data on a per-company basis, then visualize it.

jaredfiacco2/LinkedinPremium_InsightsScraper

Automatically log into LinkedIn, scrape insights data, store it locally, and visualize it with Jupyter Notebook and Tableau.

Table of Contents
  1. About The Project
  2. Prerequisites & Instructions
  3. Contact

About The Project

  • This code uses Python and Selenium to log into a LinkedIn Premium account, loop through a list of companies, and harvest the hidden table data from the HTML.
  • I then used pandas to store the data in .pkl files.
  • Finally, I created data visualizations using pandas and Matplotlib in Jupyter Notebook as well as Tableau.
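The "harvest the hidden table data from the HTML" step above can be sketched with pandas alone: once Selenium has rendered the page, every `<table>` element can be pulled straight out of `driver.page_source`, even when the table is hidden with CSS. This is a minimal sketch; the HTML snippet stands in for the real page source, and the column names are illustrative assumptions, not LinkedIn's actual schema.

```python
# Sketch: extract hidden <table> data from rendered HTML with pandas.
# The HTML below is a stand-in for Selenium's driver.page_source;
# the Function/Headcount columns are hypothetical, not LinkedIn's schema.
from io import StringIO

import pandas as pd

page_source = """
<div style="display:none">
  <table>
    <tr><th>Function</th><th>Headcount</th></tr>
    <tr><td>Engineering</td><td>120</td></tr>
    <tr><td>Sales</td><td>80</td></tr>
  </table>
</div>
"""

# read_html finds every <table> in the markup, hidden or not
tables = pd.read_html(StringIO(page_source))
df = tables[0]
print(df["Headcount"].sum())  # → 200
```

Because `read_html` ignores styling, the `display:none` wrapper that hides the table from the browser user is irrelevant to the scraper.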

Process Map

Built With

Prerequisites & Instructions

  1. Install all required packages:
pip install -r requirements.txt
  2. Use Python to run 'main.py'. This will log into LinkedIn, loop through a list of companies, and scrape the data from hidden tables, saving it to pickle files for future visualizations.
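The "saving it to pickle files" part of this step is a simple pandas round trip: each scraped data frame is written with `to_pickle` and later restored with `read_pickle`. A minimal sketch, assuming an illustrative file name and columns (the repo's actual names may differ):

```python
# Sketch of the storage step: persist a scraped data frame to a .pkl
# file and read it back unchanged. File name and columns are hypothetical.
import pandas as pd

df = pd.DataFrame(
    {"Function": ["Engineering", "Sales"], "Headcount": [120, 80]}
)

# Write the frame to disk for the later visualization steps
df.to_pickle("example_corp_insights.pkl")

# Reading it back yields an identical frame, dtypes included
restored = pd.read_pickle("example_corp_insights.pkl")
print(restored.equals(df))  # → True
```

Pickle preserves dtypes and the index exactly, which is why it is convenient here compared with round-tripping through CSV.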

linkedin premium data that gets scraped

  3. Use Jupyter Notebook to manipulate the data frames and create visualizations in Matplotlib.

jupyter notebook for dataframe manipulations and visualizations
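One way the notebook step can chart a pickled data frame with pandas and Matplotlib. This is a sketch only: the columns are assumptions, the frame is built inline instead of loaded from a .pkl, and the Agg backend is selected so it also renders outside a notebook.

```python
# Sketch: bar chart of a scraped data frame, as in the notebook step.
# Column names are hypothetical; in the real workflow the frame would
# come from pd.read_pickle(...).
import matplotlib

matplotlib.use("Agg")  # render off-screen so this runs outside Jupyter
import matplotlib.pyplot as plt
import pandas as pd

df = pd.DataFrame(
    {"Function": ["Engineering", "Sales"], "Headcount": [120, 80]}
)

ax = df.plot.bar(x="Function", y="Headcount", legend=False)
ax.set_ylabel("Headcount")
plt.tight_layout()
plt.savefig("headcount_by_function.png")
```

In an actual notebook the `matplotlib.use("Agg")` and `savefig` lines would be unnecessary; the chart renders inline.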

  4. Use Tableau to visualize the data.

use tableau for visualization

Contact

Jared Fiacco - [email protected]

A GCP Data Engineering Project of Mine: Publish Computer Statistics to Pub/Sub, Use Cloud Functions to Store in BigQuery
