docs: dlt plus: custom page header (#2337)
* Add dlt+ logo inside breadcrumbs for every DLT Plus page
* Add custom Admonition before H1 for every DLT Plus page
* fix: ensure UTF-8 encoding when reading versions.json and clean up whitespace in docusaurus.config.js
stantsvek authored Mar 5, 2025
1 parent 2812f02 commit ad96e95
Showing 33 changed files with 227 additions and 142 deletions.
3 changes: 0 additions & 3 deletions docs/website/docs/_plus_admonition.md

This file was deleted.

7 changes: 2 additions & 5 deletions docs/website/docs/plus/core-concepts/cache.md
@@ -1,16 +1,13 @@
---
title: "🧪 Cache"
title: "Cache 🧪"
description: Execute data transformations in your local cache
keywords: ["dlt+", "cache", "transformations"]
---

import Link from '../../_plus_admonition.md';

<Link/>

:::caution
🚧 This feature is under development, and the interface may change in future releases. Interested in becoming an early tester? [Join dlt+ early access](https://info.dlthub.com/waiting-list)
:::

The dlt+ Cache is a temporary local storage layer created by dlt+ to enhance development workflows. It allows you to efficiently run local transformations, materialize dbt models, and test your queries before deploying them to production.

## How it works
4 changes: 0 additions & 4 deletions docs/website/docs/plus/core-concepts/datasets.md
@@ -1,9 +1,5 @@
# Datasets

import Link from '../../_plus_admonition.md';

<Link/>

A dataset is a physical collection of data and dlt metadata, including the schema on a destination. One destination can have multiple datasets; for now, datasets are bound to a physical destination, but this may change in future iterations.

By treating datasets as individual entities, dlt+ enables data cataloging and data governance.
4 changes: 0 additions & 4 deletions docs/website/docs/plus/core-concepts/profiles.md
@@ -3,10 +3,6 @@ title: Profiles
keywords: [dlt+, profiles]
---

import Link from '../../_plus_admonition.md';

<Link/>

A profile is a set of configurations and secrets defined for a specific use case. Profiles provide a way to manage different configurations for different environments.

They are defined in the `dlt.yml` under the `profiles` section.
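
For illustration, here is a minimal sketch of what a `profiles` section in `dlt.yml` might look like; the profile names and the overridden settings are assumptions made for this example, not taken from the dlt+ reference:

```yaml
# Hypothetical sketch: two profiles that override runtime configuration.
# The option values are illustrative assumptions.
profiles:
  dev:
    runtime:
      log_level: DEBUG      # verbose logging while developing locally
  prod:
    runtime:
      log_level: WARNING    # quieter logging for production runs
```
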
4 changes: 0 additions & 4 deletions docs/website/docs/plus/core-concepts/project.md
@@ -1,9 +1,5 @@
# Project

import Link from '../../_plus_admonition.md';

<Link/>

A dlt+ Project offers developers a declarative approach for defining data workflow components: sources, destinations, pipelines, transformations, parameters, etc. It follows an opinionated structure centered around a YAML manifest file, `dlt.yml`, where all dlt entities are defined in an organized way. The manifest file acts as a single source of truth for data pipelines, keeping all teams aligned.
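
For illustration, here is a rough sketch of the kind of entities a `dlt.yml` manifest declares; the entity names and exact keys are assumptions made for this example rather than a definitive schema:

```yaml
# Hypothetical dlt.yml sketch: one source, one destination, and a pipeline
# wiring them together. Names and keys are illustrative assumptions.
sources:
  github_events:
    type: rest_api            # assumed source type
destinations:
  local_duckdb:
    type: duckdb
pipelines:
  events_pipeline:
    source: github_events
    destination: local_duckdb
    dataset_name: events_data
```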

The project layout has the following components:
4 changes: 0 additions & 4 deletions docs/website/docs/plus/ecosystem/delta.md
@@ -4,10 +4,6 @@ description: Delta destination
keywords: [delta, delta lake]
---

import Link from '../../_plus_admonition.md';

<Link/>

# Delta

The Delta destination is based on the [filesystem destination](../../dlt-ecosystem/destinations/filesystem.md) in dlt. All configuration options from the filesystem destination can be configured as well.
4 changes: 0 additions & 4 deletions docs/website/docs/plus/ecosystem/iceberg.md
@@ -4,10 +4,6 @@ description: Iceberg destination
keywords: [Iceberg, pyiceberg]
---

import Link from '../../_plus_admonition.md';

<Link/>

# Iceberg

The Iceberg destination is based on the [filesystem destination](../../dlt-ecosystem/destinations/filesystem.md) in dlt. All configuration options from the filesystem destination can be configured as well.
10 changes: 3 additions & 7 deletions docs/website/docs/plus/ecosystem/ms-sql.md
@@ -4,10 +4,6 @@ description: MS SQL replication
keywords: [MSSQL, CDC, Change Tracking, MSSQL replication]
---

import Link from '../../_plus_admonition.md';

<Link/>

# MS SQL replication

dlt+ provides a comprehensive solution for syncing an MS SQL Server table using [Change Tracking](https://learn.microsoft.com/en-us/sql/relational-databases/track-changes/about-change-tracking-sql-server), a solution similar to CDC. By leveraging SQL Server's native Change Tracking feature, you can efficiently load incremental data changes — including inserts, updates, and deletes — into your destination.
@@ -127,7 +123,7 @@ When running for the first time, it is necessary to pass the `tracking_version`

### Incremental loading

After the initial load, you can run the `create_change_tracking_table` resource on a schedule to load only the changes since the last tracking version using SQL Server’s `CHANGETABLE` function.
You do not need to pass `initial_tracking_version` anymore, since this is automatically stored in the `dlt` state.

@@ -307,14 +303,14 @@ pipeline.run(incremental_resource)

### Hard deletes

By default, `hard_delete` is set to `True`, meaning hard deletes are performed, i.e., rows deleted in the source will be permanently removed from the destination.

Replicated data allows NULLs in non-nullable columns when a record is deleted. To avoid additional tables that hold deleted rows and additional merge steps,
`dlt` emits placeholder values that are stored in the staging dataset only.

### Soft deletes

If `hard_delete` is set to `False`, soft deletes are performed, i.e., rows deleted in the source will be marked as deleted but not physically removed from the destination.

In this case, the destination schema must accept NULLs for the replicated columns, so make sure you pass the `remove_nullability_adapter` adapter to the `sql_table` resource:

4 changes: 0 additions & 4 deletions docs/website/docs/plus/features/ai.md
@@ -4,10 +4,6 @@ description: Explore data in your dlt+ project with Claude Desktop using the Mod
keywords: [dlt+, Claude Desktop, MCP, Model Context Protocol]
---

import Link from '../../_plus_admonition.md';

<Link/>

# AI workflows

As part of dlt+, we are developing several tools to enhance development with AI workflows. The first of these is a [Model Context Protocol (MCP)](https://modelcontextprotocol.io) plugin for Claude Desktop for data exploration.
4 changes: 0 additions & 4 deletions docs/website/docs/plus/features/data-access.md
@@ -4,10 +4,6 @@ description: Provide secure data access to your organization
keywords: ["data access", "security", "contracts", "data sharing"]
---

import Link from '../../_plus_admonition.md';

<Link/>

# Secure data access and sharing

dlt+ makes it easy for end-users like data scientists or analysts to access high-quality production data in a secure and Python-friendly way. A [dlt+ Project](../core-concepts/project.md) exposes a standard Python API which connects to the production data using an "access" [profile](../core-concepts/profiles.md). This profile can be configured to specify how users are allowed to interact with the data, e.g., by applying restrictions on datasets that are not allowed to be modified.
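
As a purely hypothetical illustration of such a restriction, an access profile might mark a dataset as read-only; the keys used below (`datasets`, `read_only`) are invented for this example and are not documented dlt+ syntax:

```yaml
# Hypothetical sketch only: key names are assumptions, not dlt+ reference syntax.
profiles:
  access:
    datasets:
      reports:
        read_only: true   # assumed flag: consumers can query but not modify the dataset
```
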
6 changes: 0 additions & 6 deletions docs/website/docs/plus/features/projects.md
@@ -1,13 +1,7 @@
# Project

import Link from '../../_plus_admonition.md';

<Link/>


<img src="https://storage.googleapis.com/dlt-blog-images/plus/dlt_plus_projects.png" width="500"/>


[dlt+ Project](../core-concepts/project.md) provides a structured and opinionated approach to organizing data workflows while implementing best practices for data engineering teams. It automates key processes such as data loading, data transformations, data cataloging, and data governance, and makes it easier for the different members of a data team to work with each other.

With dlt+ Project, you can efficiently manage your data workflows by:
6 changes: 1 addition & 5 deletions docs/website/docs/plus/features/quality/data-quality.md
@@ -1,13 +1,9 @@
---
title: "🧪 Data quality"
title: "Data quality 🧪"
description: Validate your data and control its quality
keywords: ["dlt+", "data quality", "contracts"]
---

import Link from '../../../_plus_admonition.md';

<Link/>

:::caution
🚧 This feature is under development. Interested in becoming an early tester? [Join dlt+ early access](https://info.dlthub.com/waiting-list).
:::
4 changes: 0 additions & 4 deletions docs/website/docs/plus/features/quality/tests.md
@@ -4,10 +4,6 @@ description: dlt+ Test utils
keywords: ["dlt+", "data tests", "test"]
---

import Link from '../../../_plus_admonition.md';

<Link/>

## Introduction

dlt+ provides a `pytest` plugin with a set of powerful fixtures and utilities that simplify testing for dlt+ projects. These testing utilities are packaged separately in `dlt-plus-tests`, making it easy to install them as a development dependency. Check the [installation guide](#installation) for instructions on how to install the package.
@@ -2,9 +2,6 @@
title: dbt generator
description: Generate dbt models automatically
---
import Link from '../../../_plus_admonition.md';

<Link/>

The **dbt generator** creates scaffolding for dbt projects using data ingested by dlt. It analyzes the pipeline schema and automatically generates staging and fact dbt models. By integrating with dlt-configured destinations, it automates code creation and supports incremental loading, ensuring that only new records are processed in both the ingestion and transformation layers.

3 changes: 0 additions & 3 deletions docs/website/docs/plus/features/transformations/index.md
@@ -4,9 +4,6 @@ description: Run local transformations with dlt+ Cache
keywords: ["dlt+", "transformations", "cache", "dbt"]
---
import DocCardList from '@theme/DocCardList';
import Link from '../../../_plus_admonition.md';

<Link/>

As part of dlt+, we provide a local transformation [cache](../../core-concepts/cache.md) — a staging layer for data transformations allowing you to test, validate, and debug data pipelines without running everything in the warehouse. With local transformations, you can:

@@ -1,10 +1,7 @@
---
title: 🧪 Python-based transformations
title: "Python-based transformations 🧪"
description: Define transformations in Python
---
import Link from '../../../_plus_admonition.md';

<Link/>

:::caution
🚧 This feature is under development, and the interface may change in future releases. Interested in becoming an early tester? [Join dlt+ early access](https://info.dlthub.com/waiting-list).
3 changes: 0 additions & 3 deletions docs/website/docs/plus/features/transformations/setup.md
@@ -2,9 +2,6 @@
title: Setup
description: Define and execute local transformations
---
import Link from '../../../_plus_admonition.md';

<Link/>

dlt+ provides a powerful mechanism for executing transformations on your data using a locally spun-up cache. It automatically creates and manages the cache before execution and cleans it up afterward.

34 changes: 16 additions & 18 deletions docs/website/docs/plus/getting-started/installation.md
@@ -3,10 +3,6 @@ title: Installation
description: Installation information for dlt+
---

import Link from '../../_plus_admonition.md';

<Link/>

:::info Supported Python versions

dlt+ currently supports Python versions 3.9-3.12.
@@ -36,8 +32,8 @@ pip --version

If you have a different Python version installed or are missing pip, follow the instructions below to update your Python version and/or install `pip`.

<Tabs values={[{"label": "Ubuntu", "value": "ubuntu"}, {"label": "macOS", "value": "macos"}, {"label": "Windows", "value": "windows"}]} groupId="operating-systems" defaultValue="ubuntu">
  <TabItem value="ubuntu">

You can install Python 3.10 with `apt`.

@@ -74,7 +70,7 @@ C:\> pip3 install -U pip
We recommend working within a [virtual environment](https://docs.python.org/3/library/venv.html) when creating Python projects.
This way, all the dependencies for your current project will be isolated from packages in other projects.

<Tabs values={[{"label": "Ubuntu", "value": "ubuntu"}, {"label": "macOS", "value": "macos"}, {"label": "Windows", "value": "windows"}]} groupId="operating-systems" defaultValue="ubuntu">

<TabItem value="ubuntu">

@@ -139,22 +135,24 @@ Please install a valid license before proceeding, as described under [licensing]
Once you have a valid license, you can make it available to `dlt+` using one of the following methods:

1. **Environment variable**: set the license key as an environment variable:

```sh
export RUNTIME__LICENSE="eyJhbGciOiJSUz...vKSjbEc==="
```

2. **Secrets file**: add the license key to a `secrets.toml` file. You can use either the project-level `secrets.toml` (located in `./.dlt/secrets.toml`) or the global one (located in `~/.dlt/secrets.toml`):

```toml
[runtime]
license="eyJhbGciOiJSUz...vKSjbEc==="
```

3. **`dlt.yml`**: add the license key directly in the [project manifest file](../features/projects.md) referencing a user-defined environment variable:

```yaml
runtime:
license: { env.MY_ENV_CONTAINING_LICENSE_KEY }
```
You can verify that the license was installed correctly and is valid by running:
4 changes: 0 additions & 4 deletions docs/website/docs/plus/getting-started/tutorial.md
@@ -4,10 +4,6 @@ description: Using the dlt+ cli commands to create and manage dlt+ Project
keywords: [command line interface, cli, dlt init, dlt+, project]
---

import Link from '../../_plus_admonition.md';

<Link/>

This tutorial introduces you to dlt+ Project and the essential cli commands needed to create and manage it. You will learn how to:

* initialize a new dlt+ Project
17 changes: 6 additions & 11 deletions docs/website/docs/plus/intro.md
@@ -3,22 +3,18 @@ title: Introduction
description: Introduction to dlt+
---

import Link from '../_plus_admonition.md';

<Link/>

# What is dlt+?

![dlt+](/img/slot-machine-gif.gif)

dlt+ is a framework for running dlt pipelines in production at scale. It is the commercial extension to the open-source data load tool (dlt). dlt+ features include:

- [Project](../plus/features/projects.md): a declarative YAML interface that allows any team member to easily define sources, destinations, and pipelines.
- [Local transformations](../plus/features/transformations/index.md): a staging layer for data transformations, combining a local cache with schema enforcement, debugging tools, and integration with existing data workflows.
- [Data quality & tests](../plus/features/quality/tests.md)
- [Iceberg support](../plus/ecosystem/iceberg.md)
- [Secure data access and sharing](../plus/features/data-access.md)
- [AI workflows](../plus/features/ai.md): agents to augment your data engineering team.

To get started with dlt+, install the library using pip (Python 3.9-3.12):

@@ -29,4 +25,3 @@ pip install dlt-plus
:::caution
dlt+ requires a license to run. If you would like a trial, please join our [waiting list](https://info.dlthub.com/waiting-list).
:::

4 changes: 0 additions & 4 deletions docs/website/docs/plus/production/observability.md
@@ -4,10 +4,6 @@ description: Observability tooling
keywords: [observability, monitoring, alerting]
---

import Link from '../../_plus_admonition.md';

<Link/>

# Observability

There are several features under development in dlt+ to enhance your observability workflows. These include:
4 changes: 0 additions & 4 deletions docs/website/docs/plus/production/runners.md
@@ -4,10 +4,6 @@ description: Run pipelines in production
keywords: [runners, lambda, airflow]
---

import Link from '../../_plus_admonition.md';

<Link/>

# Runners

With dlt+ you can now run pipelines directly from the command line, allowing you to go to production faster:
7 changes: 1 addition & 6 deletions docs/website/docs/plus/reference.md
@@ -4,12 +4,7 @@ description: Command line interface (CLI) full reference of dlt
keywords: [command line interface, cli, dlt init]
---

import Link from '../_plus_admonition.md';

<Link/>


# Command Line Interface Reference
# Command line interface reference

<!-- this page is fully generated from the argparse object of dlt, run make update-cli-docs to update it -->

4 changes: 2 additions & 2 deletions docs/website/docusaurus.config.js
@@ -16,7 +16,7 @@ const versions = {"current": {

let knownVersions = ["current"];
if (fs.existsSync("versions.json")) {
knownVersions = JSON.parse(fs.readFileSync("versions.json"));
knownVersions = JSON.parse(fs.readFileSync("versions.json", 'utf8'));
}

// inject master version renaming only if versions present and master included
@@ -52,7 +52,7 @@ const config = {
baseUrl: process.env.DOCUSAURUS_BASE_URL || '/docs',
onBrokenLinks: 'throw',
onBrokenMarkdownLinks: 'throw',
onBrokenAnchors: 'throw',
favicon: 'img/favicon.ico',
staticDirectories: ['public', 'static'],
