| Background | Description | Setup | Get data | Build Containers | ETL | Serve to Metabase | Metabase Dashboard | Interactive Map |
As part of the Hack the Arctic hackathon, the team developed an end-to-end containerized pipeline and a dashboard that tracks snow depth at varying levels of granularity: the entire Varanger region, its 3 localities, and its 6 sections. The dashboard is interactive and supports features such as hovering, zooming, and data drill-down.
Link to our submission here!
Due to climate change, global average temperatures have been rising. Changes in snow depth in regions near the Arctic help us better understand the ecological impacts of climate change. Some areas that historically had deep snow now see much shallower levels. This also affects the habitats of local animals and may force them to migrate to other areas. Analyzing snow depth data is thus crucial not only for validating and monitoring the effects of climate change, but also for tracking which areas are most affected, so that we can adapt accordingly.
In the Varanger region of Norway, there are over 100 observation sites set up to track snow depth, which is measured yearly. Here we developed a dashboard and visualizations to display and monitor the data from those sites. The team built an end-to-end pipeline that automates the entire data ingestion and formatting process and deploys the results to a live interactive dashboard.
The dashboard tracks snow depth at varying levels of granularity: the entire Varanger region, its 3 localities, and its 6 sections. It is also interactive and supports features such as hovering, zooming, and data drill-down.
The diagram below illustrates the system design and workflow:
Open a terminal, `cd` to this directory, and create a Python virtual environment using:
Windows:
> python -m virtualenv .venv
Mac/Linux:
$ make build
Then activate the virtual environment by executing:
Windows:
> .venv\Scripts\activate.bat
Install dependencies (Windows) using:
> python -m pip install -r requirements.txt
Get COAT data
Download the snowdepth zip file and extract it to the `data` folder with the following commands:
Windows:
> python app\scraper.py
> python app\etl.py
Mac/Linux:
$ make run
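For context, the download-and-extract step might look roughly like the sketch below. The URL and archive layout are hypothetical placeholders; the real logic lives in `app/scraper.py`.

```python
"""Sketch of the COAT data download step (URL and layout are hypothetical)."""
import io
import zipfile
from pathlib import Path

import requests

# Placeholder URL -- the real endpoint is defined in app/scraper.py.
SNOWDEPTH_ZIP_URL = "https://example.org/coat/snowdepth.zip"
DATA_DIR = Path("data")


def download_and_extract(url: str = SNOWDEPTH_ZIP_URL, dest: Path = DATA_DIR) -> None:
    """Download the snow depth zip archive and extract its CSVs into data/."""
    dest.mkdir(parents=True, exist_ok=True)
    response = requests.get(url, timeout=60)
    response.raise_for_status()
    with zipfile.ZipFile(io.BytesIO(response.content)) as archive:
        archive.extractall(dest)


if __name__ == "__main__":
    download_and_extract()
```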
Now that we have the CSV files in the `data` folder, we can build our Docker containers using this command:
docker-compose up
This command will build our `dbt`, `postgres`, and `metabase` containers. This will also run our data loading, transformations, and modeling in the background.
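A practical detail in setups like this is that dbt can only run once the `postgres` container accepts connections. The sketch below shows one way to wait for readiness; the connection parameters are assumptions, not necessarily the project's actual docker-compose values.

```python
"""Sketch: block until Postgres accepts connections (parameters are assumed)."""
import time

import psycopg2

# Assumed connection details -- check docker-compose.yml for the real ones.
DSN = "host=localhost port=5432 dbname=snowdepth user=postgres password=postgres"


def wait_for_postgres(dsn: str = DSN, retries: int = 30, delay: float = 2.0) -> None:
    """Retry connecting until Postgres is ready or retries are exhausted."""
    for _ in range(retries):
        try:
            psycopg2.connect(dsn).close()
            return
        except psycopg2.OperationalError:
            time.sleep(delay)
    raise RuntimeError("Postgres did not become ready in time")
```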
During `docker-compose`, dbt runs the following commands (see the sketch after this list):

- `dbt init snowdepth`: Creates the project folder
- `dbt debug`: Checks the connection with the Postgres database
- `dbt deps`: Installs the test dependencies
- `dbt seed`: Loads the CSV files into staging tables in the `postgres` database
- `dbt run`: Runs the transformations and loads the data into the database
- `dbt test`: Tests the models
- `dbt docs generate`: Generates the documentation of the dbt project
- `dbt docs serve`: Serves the documentation on a webserver
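For illustration, that same sequence could be driven by a small Python entrypoint like the sketch below; the actual container entrypoint may differ.

```python
"""Sketch: run the dbt steps above in order (the real entrypoint may differ)."""
import subprocess

DBT_STEPS = [
    ["dbt", "init", "snowdepth"],
    ["dbt", "debug"],
    ["dbt", "deps"],
    ["dbt", "seed"],
    ["dbt", "run"],
    ["dbt", "test"],
    ["dbt", "docs", "generate"],
    ["dbt", "docs", "serve"],
]

for step in DBT_STEPS:
    # check=True stops the pipeline as soon as one dbt step fails.
    subprocess.run(step, check=True)
```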
Now that the data is loaded and transformed in our database, we can view it at http://localhost:3000. You may need to log in; the credentials are:
email: [email protected]
password: password1
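To check programmatically that Metabase is up, you can log in through its standard session API with the same credentials; a minimal sketch:

```python
"""Sketch: log in to the local Metabase instance via its session API."""
import requests

METABASE_URL = "http://localhost:3000"

resp = requests.post(
    f"{METABASE_URL}/api/session",
    json={"username": "[email protected]", "password": "password1"},
    timeout=30,
)
resp.raise_for_status()
# The token goes into the X-Metabase-Session header on subsequent API calls.
print("Logged in, session token:", resp.json()["id"])
```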
We built a dashboard tracking snow depth in the Varanger region of Norway. This is crucial for monitoring the effects of climate change and rising average global temperatures.
The dashboard tracks snow depth at varying levels of granularity: the entire Varanger region, its 3 localities, and its 6 sections.
The dashboard is also interactive and supports features such as hovering and zooming.
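A minimal sketch of how such an interactive site map could be built with folium; the tool choice, CSV file name, and column names are assumptions, not necessarily what the project actually uses.

```python
"""Sketch: interactive map of observation sites (file and columns are assumed)."""
import folium
import pandas as pd

# Hypothetical file with columns: site, latitude, longitude, snow_depth_cm.
sites = pd.read_csv("data/snowdepth.csv")

# Center roughly on the Varanger peninsula in northern Norway.
site_map = folium.Map(location=[70.4, 29.0], zoom_start=8)

for _, row in sites.iterrows():
    folium.CircleMarker(
        location=[row["latitude"], row["longitude"]],
        radius=5,
        tooltip=f"{row['site']}: {row['snow_depth_cm']} cm",  # hover text
    ).add_to(site_map)

site_map.save("map.html")  # open in a browser to hover and zoom
```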