
Commit

fixed image paths
hibajamal committed Jan 16, 2024
1 parent 74486f9 commit abe43eb
Showing 1 changed file with 9 additions and 9 deletions.
18 changes: 9 additions & 9 deletions docs/website/blog/2024-01-10-dlt-mode.md
@@ -1,7 +1,7 @@
---
slug: dlt-mode-blog
title: "The Modern Data Stack with dlt & Mode"
-image: /img/blog-mode-dataflow1.png
+image: https://storage.googleapis.com/dlt-blog-images/blog-mode-dataflow1.png
authors:
name: Hiba Jamal
title: Data Science intern at dlthub
@@ -70,7 +70,7 @@ In 2022, dbt introduced its semantic layer to address the challenge faced by BI
<div style={{ display: 'flex' }}>
<div style={{ flex: '1' }}>

-![semantic layer](/img/blog-mode-semantic-layer-dbt.jpg)
+![semantic layer](https://storage.googleapis.com/dlt-blog-images/blog-mode-semantic-layer-dbt.jpg)

</div>

@@ -94,18 +94,18 @@ There are two ways to use dlt and Mode to uncomplicate your workflows.

### 1. Extract, Normalize and Load with dlt and Visualize with Mode

-![data flow 1](/img/blog-mode-dataflow1.png)
+![data flow 1](https://storage.googleapis.com/dlt-blog-images/blog-mode-dataflow1.png)

The data in this demo comes from Shopify. The configuration to initialize a Shopify source can be found in the dltHub docs. Once a dlt pipeline is initialized for Shopify, data from the source can be streamed into the destination of your choice; in this demo, we have chosen BigQuery, which is then connected to Mode. Mode’s SQL editor is where you model your data for reports, removing unnecessary columns or adding/subtracting the tables you want to make available to teams.

-![sql editor](/img/blog-mode-editor.png)
+![sql editor](https://storage.googleapis.com/dlt-blog-images/blog-mode-editor.png)

This stage can be seen as Mode’s own data transformation layer, or its semantic modelling layer, depending on the team or role of the user. Next, the reporting step is also simplified in Mode.

<div style={{ display: 'flex' }}>
<div style={{ flex: '1' }}>

-![data flow 1](/img/blog-mode-report1.png)
+![data flow 1](https://storage.googleapis.com/dlt-blog-images/blog-mode-report1.png)

</div>

@@ -119,7 +119,7 @@ With the model we just created, called Products, a chart can be instantly create

### 2. Use dlt from within the python workspace in Mode

-![data flow 2](/img/blog-mode-dataflow2.png)
+![data flow 2](https://storage.googleapis.com/dlt-blog-images/blog-mode-dataflow2.png)

In this demo, we’ll forgo the authentication hassle of connecting to a data warehouse and choose the DuckDB destination to show how the Python environment within Mode can be used to initialize a data pipeline and load normalized data into a destination. To see how it works, we first install dlt[duckdb] into the Python environment.

@@ -140,15 +140,15 @@ pipeline = dlt.pipeline(

And then, we pass our data into the pipeline, and check out the load information. Let's look at what the Mode cell outputs:

-![load information](/img/blog-mode-load-info.png)
+![load information](https://storage.googleapis.com/dlt-blog-images/blog-mode-load-info.png)

Let’s check if our pipeline exists within the Mode ecosystem:

-![mode file system](/img/blog-mode-env-dir.png)
+![mode file system](https://storage.googleapis.com/dlt-blog-images/blog-mode-env-dir.png)

Here we see that the pipeline does exist. Courtesy of Mode, anything within the pipeline that we can query through Python can also be added to the final report or dashboard using the “Add to Report” button.
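One way to perform this check from Python is to list dlt's pipeline working directories. This sketch assumes dlt's default location, `~/.dlt/pipelines/<pipeline_name>` (overridable via the `pipelines_dir` argument of `dlt.pipeline`):

```python
import os

# Default location where dlt keeps each pipeline's working state
# (an assumption; your environment may override it)
pipelines_dir = os.path.expanduser("~/.dlt/pipelines")

if os.path.isdir(pipelines_dir):
    # Each entry is a pipeline name, e.g. the one we just ran
    print(os.listdir(pipelines_dir))
else:
    print("no pipeline has run yet")
```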

-![add to report button](/img/blog-mode-add-to-report.png)
+![add to report button](https://storage.googleapis.com/dlt-blog-images/blog-mode-add-to-report.png)

Once a pipeline is initialized within Mode’s Python environment, the notebook cell can be frozen, and every subsequent run of the notebook becomes a call to the data source, updating the data warehouse and reports together!

