
Commit

Merge branch 'current' into upgrade/guides-v2
john-rock authored Feb 19, 2025
2 parents 1b85cbb + 17ee039 commit 832d13b
Showing 18 changed files with 101 additions and 33 deletions.
8 changes: 7 additions & 1 deletion website/blog/2025-01-23-levels-of-sql-comprehension.md
@@ -12,6 +12,10 @@ date: 2025-01-23
is_featured: true
---

:::note
This is part one of a series. For the second article, see [The key technologies behind SQL Comprehension](/blog/sql-comprehension-technologies).
:::


Ever since [dbt Labs acquired SDF Labs last week](https://www.getdbt.com/blog/dbt-labs-acquires-sdf-labs), I've been head-down diving into their technology and making sense of it all. The main thing I knew going in was "SDF understands SQL". It's a nice pithy quote, but the specifics are *fascinating.*

@@ -145,6 +149,8 @@ In introducing these concepts, we’re still just scratching the surface. There'
- How this is all going to roll into a step change in the experience of working with data
- What it means for doing great data work

Over the coming days, you'll be hearing more about all of this from the dbt Labs team - both familiar faces and our new friends from SDF Labs.
To learn more, check out [The key technologies behind SQL Comprehension](/blog/sql-comprehension-technologies).

Over the coming days, you'll hear more about all of this from the dbt Labs team - both familiar faces and our new friends from SDF Labs.

This is a special moment for the industry and the community. It's alive with possibilities, with ideas, and with new potential. We're excited to navigate this new frontier with all of you.
6 changes: 4 additions & 2 deletions website/docs/docs/build/incremental-strategy.md
@@ -21,9 +21,11 @@ The [`microbatch` incremental strategy](/docs/build/incremental-microbatch) is i

### Supported incremental strategies by adapter

This table represents the availability of each incremental strategy, based on the latest version of dbt Core and each adapter.
This table shows the support of each incremental strategy across adapters available on dbt Cloud's [Latest release track](/docs/dbt-versions/cloud-release-tracks). Some strategies may be unavailable if you're not on "Latest" and the feature hasn't been released to the "Compatible" track.

Click the name of the adapter in the below table for more information about supported incremental strategies.
If you're interested in an adapter available in dbt Core only, check out the [adapter's individual configuration page](/reference/resource-configs/resource-configs) for more details.

Click the name of the adapter in the following table for more information about supported incremental strategies:

| Data platform adapter | `append` | `merge` | `delete+insert` | `insert_overwrite` | `microbatch` |
|-----------------------|:--------:|:-------:|:---------------:|:------------------:|:-------------------:|
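To show how one of these strategies is applied in practice, here's a minimal sketch of an incremental model config, assuming your adapter supports the `merge` strategy (the `stg_events` model and its columns are hypothetical):

```sql
{{
  config(
    materialized='incremental',
    incremental_strategy='merge',
    unique_key='event_id'
  )
}}

select * from {{ ref('stg_events') }}

{% if is_incremental() %}
-- on incremental runs, only process rows that arrived since the last run
where event_loaded_at > (select max(event_loaded_at) from {{ this }})
{% endif %}
```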
10 changes: 7 additions & 3 deletions website/docs/docs/cloud/about-cloud-develop-defer.md
@@ -35,12 +35,16 @@ When using defer, it compares artifacts from the most recent successful producti

### Defer in the dbt Cloud IDE

To enable defer in the dbt Cloud IDE, toggle the **Defer to production** button on the command bar. Once enabled, dbt Cloud will:
To use deferral in the IDE, you must have production artifacts generated by a deploy job. dbt Cloud first checks for these artifacts in your Staging environment (if available) and falls back to the Production environment.

1. Pull down the most recent manifest from the Production environment for comparison
The defer feature in the IDE won't work if a Staging environment exists but no deploy job has run. This is because the necessary metadata to power defer won't exist until a deploy job has run successfully in the Staging environment.

To enable defer in the dbt Cloud IDE, toggle the **Defer to staging/production** button on the command bar. Once enabled, dbt Cloud will:

1. Pull down the most recent manifest from the Staging or Production environment for comparison
2. Pass the `--defer` flag to the command (for any command that accepts the flag)

For example, if you were to start developing on a new branch with [nothing in your development schema](/reference/node-selection/defer#usage), edit a single model, and run `dbt build -s state:modified` — only the edited model would run. Any `{{ ref() }}` functions will point to the production location of the referenced models.
For example, if you were to start developing on a new branch with [nothing in your development schema](/reference/node-selection/defer#usage), edit a single model, and run `dbt build -s state:modified` — only the edited model would run. Any `{{ ref() }}` functions will point to the staging or production location of the referenced models.

<Lightbox src="/img/docs/dbt-cloud/defer-toggle.jpg" width="100%" title="Select the 'Defer to production' toggle on the bottom right of the command bar to enable defer in the dbt Cloud IDE."/>

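As a concrete sketch (the model and column names here are hypothetical): with the toggle enabled and `stg_orders` not yet built in your development schema, the reference below compiles to the staging or production relation rather than your development schema.

```sql
-- models/orders_enriched.sql
select
    order_id,
    ordered_at,
    order_total
-- with defer enabled, this ref resolves to the staging or production
-- location because stg_orders doesn't exist in the development schema
from {{ ref('stg_orders') }}
```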
16 changes: 12 additions & 4 deletions website/docs/docs/cloud/use-visual-editor.md
@@ -37,15 +37,23 @@ To access the visual editor:
To create a dbt SQL model, click on **Create a new model** and perform the following steps. Note that you can't create source models in the visual editor. This is because you need a production run with the sources already created.

1. Drag an operator from the operator toolbar and drop it onto the canvas.
2. Click on the operator to open its configuration panel:
- **Model**: Select the model and columns you want to use.
2. Click on the operator to open its configuration panel:
#### Input
- **Input model**: Select the model and columns you want to use.
<br />
#### Transform
- **Join**: Define the join conditions and choose columns from both tables.
- **Select**: Pick the columns you need from the model.
- **Aggregate**: Specify the aggregation functions and the columns they apply to.
- **Formula**: Add the formula to create a new column. Includes support for Jinja expressions and macros normally called in `SELECT` statements. Use the built-AI code generator to help generate SQL code by clicking on the question mark (?) icon. Enter your prompt and wait to see the results.
- **Formula**: Add the formula to create a new column. Use the built-in AI code generator to help generate SQL code by clicking on the question mark (?) icon. Enter your prompt and wait to see the results.
- **Filter**: Set the conditions to filter data.
- **Order**: Select the columns to sort by and the sort order.
- **Limit**: Set the maximum number of rows you want to return.
- **Limit**: Set the maximum number of rows you want to return.
<br />
#### Output model
- **Output model**: The final transformed dataset generated by a dbt model.

Currently, you can only have one output model in the visual editor, but in the future, it'll be possible to have multiple output models.
3. View the **Output** and **SQL Code** tabs.
- Each operator has an Output tab that allows you to preview the data from that configured node.
- The Code tab displays the SQL code generated by the node's configuration. Use this to see the SQL for your visual model config.
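For reference, a visual pipeline built from these operators compiles to an ordinary dbt SQL model. Here's a rough sketch of what the Code tab might show for an input, join, aggregate, and limit flow (the model and column names are hypothetical, and the generated SQL may differ in structure):

```sql
with customers as (
    -- Input operator: source model and selected columns
    select customer_id, customer_name from {{ ref('stg_customers') }}
),

orders as (
    select customer_id, order_total from {{ ref('stg_orders') }}
),

joined as (
    -- Join operator: match customers to their orders
    select
        customers.customer_id,
        customers.customer_name,
        orders.order_total
    from customers
    inner join orders
        on customers.customer_id = orders.customer_id
)

-- Aggregate operator: total spend per customer
select
    customer_id,
    customer_name,
    sum(order_total) as total_spend
from joined
group by customer_id, customer_name
-- Limit operator: cap the number of rows returned
limit 100
```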
25 changes: 17 additions & 8 deletions website/docs/docs/cloud/visual-editor-interface.md
@@ -36,14 +36,23 @@ The operator toolbar above the canvas contains the different transformation oper
<Lightbox src="/img/docs/dbt-cloud/visual-editor/edit-model.png" width="90%" title="Use the operator toolbar to perform different transformation operations." />

The following operators are available:
- **Model**: This represents a data model. Use this to select the source table and the columns you want to include. There are no limits to the number of models you can have in a session.
- **Join**: Join two models and configure the join conditions by selecting which columns to include from each table. Requires two inputs. For example, you might want to join both tables using the 'ID' column found in both tables.
- **Select**: Use this to 'select' specific columns from a table.
- **Aggregate**: Allows you to perform aggregations like GROUP, SUM, AVG, COUNT, and so on.
- **Formula**: Create new columns using custom SQL formulas. Use a built-in AI code generator to generate SQL by clicking the ? icon. For example, you can use the formula node to only extract the email domain and ask the AI code generator to help you write the SQL for that code extraction.
- **Filter**: Filter data based on conditions you set.
- **Order**: Sort data by specific columns.
- **Limit**: Limits the number of rows returned back.

#### Input
- **Input model**: Select the model and columns you want to use.

#### Transform
- **Join**: Define the join conditions and choose columns from both tables.
- **Select**: Pick the columns you need from the model.
- **Aggregate**: Specify the aggregation functions and the columns they apply to.
- **Formula**: Add the formula to create a new column. Use the built-in AI code generator to help generate SQL code by clicking on the question mark (?) icon. Enter your prompt and wait to see the results.
- **Filter**: Set the conditions to filter data.
- **Order**: Select the columns to sort by and the sort order.
- **Limit**: Set the maximum number of rows you want to return.

#### Output model
- **Output model**: The final transformed dataset generated by a dbt model.

Currently, you can only have one output model in the visual editor, but in the future, it'll be possible to have multiple output models.

When you click on each operator, it opens a configuration panel. The configuration panel allows you to configure the operator, review the current model, preview changes to the model, view the SQL code for the node, and delete the operator.

1 change: 1 addition & 0 deletions website/docs/docs/dbt-versions/release-notes.md
@@ -18,6 +18,7 @@ Release notes are grouped by month for both multi-tenant and virtual private clo

## February 2025

- **Enhancement**: The [Python SDK](/docs/dbt-cloud-apis/sl-python) added a new timeout parameter to the Semantic Layer client and to the underlying GraphQL clients. Set a single timeout value, or use the `total_timeout` parameter in the global `TimeoutOptions` to control connect, execute, and close timeouts granularly. `ExponentialBackoff.timeout_ms` is now deprecated.
- **New**: The [Azure DevOps](/docs/cloud/git/connect-azure-devops) integration for Git now supports [Entra service principal apps](/docs/cloud/git/setup-service-principal) on dbt Cloud Enterprise accounts. Microsoft is enforcing MFA across user accounts, including service users, which will impact existing app integrations. This is a phased rollout, and dbt Labs recommends [migrating to a service principal](/docs/cloud/git/setup-service-principal#migrate-to-service-principal) on existing integrations once the option becomes available in your account.

## January 2025
15 changes: 8 additions & 7 deletions website/docs/faqs/Git/git-migration.md
@@ -9,18 +9,19 @@ tags: [Git]

To migrate from one Git provider to another, refer to the following steps to minimize disruption:

1. Outside of dbt Cloud, you'll need to import your existing repository into your new provider.
1. Outside of dbt Cloud, you'll need to import your existing repository into your new provider. By default, connecting your repository in one account won't automatically disconnect it from another account.

As an example, if you're migrating from GitHub to Azure DevOps, you'll need to import your existing repository (GitHub) into your new git provider (Azure DevOps). For detailed steps on how to do this, refer to your git provider's documentation (Such as [GitHub](https://docs.github.com/en/migrations/importing-source-code/using-github-importer/importing-a-repository-with-github-importer), [GitLab](https://docs.gitlab.com/ee/user/project/import/repo_by_url.html), [Azure DevOps](https://learn.microsoft.com/en-us/azure/devops/repos/git/import-git-repository?view=azure-devops))
As an example, if you're migrating from GitHub to Azure DevOps, you'll need to import your existing repository (GitHub) into your new Git provider (Azure DevOps). For detailed steps on how to do this, refer to your Git provider's documentation (such as [GitHub](https://docs.github.com/en/migrations/importing-source-code/using-github-importer/importing-a-repository-with-github-importer), [GitLab](https://docs.gitlab.com/ee/user/project/import/repo_by_url.html), or [Azure DevOps](https://learn.microsoft.com/en-us/azure/devops/repos/git/import-git-repository?view=azure-devops)).

2. Go back to dbt Cloud and set up your [integration for the new git provider](/docs/cloud/git/connect-github), if needed.
3. Disconnect the old repository in dbt Cloud by going to **Account Settings** and then **Projects**. Click on the **Repository** link, then click **Edit** and **Disconnect**.
2. Go back to dbt Cloud and set up your [integration for the new Git provider](/docs/cloud/git/git-configuration-in-dbt-cloud), if needed.
3. Disconnect the old repository in dbt Cloud by going to **Account Settings** and then **Projects**.
4. Click on the **Repository** link, then click **Edit** and **Disconnect**.

<Lightbox src="/img/docs/dbt-cloud/disconnect-repo.png" title="Disconnect and reconnect your git repository in your dbt Cloud Account Settings pages."/>
<Lightbox src="/img/docs/dbt-cloud/disconnect-repo.png" width="80%" title="Disconnect and reconnect your Git repository in your dbt Cloud Account settings page."/>

4. On the same page, connect to the new git provider repository by clicking **Configure Repository**
5. On the same page, connect to the new Git provider repository by clicking **Configure Repository**
- If you're using the native integration, you may need to OAuth to it.

5. That's it, you should now be connected to the new git provider! 🎉
6. That's it! You should now be connected to the new Git provider. 🎉

Note &mdash; We recommend refreshing your page and the dbt Cloud IDE before performing any actions.
4 changes: 2 additions & 2 deletions website/docs/reference/configs-and-properties.md
@@ -28,14 +28,14 @@ Depending on the resource type, configurations can be defined in the dbt project

<VersionBlock firstVersion="1.9">

1. Using a [`config` property](/reference/resource-properties/config) in a `.yml` file in the `models/`, `snapshots/`, `seeds/`, `analyses`, or `tests/` directory
1. Using a [`config` property](/reference/resource-properties/config) in a `.yml` file for supported resource directories like `models/`, `snapshots/`, `seeds/`, `analyses/`, `tests/`, and more.
2. From the [`dbt_project.yml` file](dbt_project.yml), under the corresponding resource key (`models:`, `snapshots:`, `tests:`, etc)
</VersionBlock>

<VersionBlock lastVersion="1.8">

1. Using a [`config()` Jinja macro](/reference/dbt-jinja-functions/config) within a `model`, `snapshot`, or `test` SQL file
2. Using a [`config` property](/reference/resource-properties/config) in a `.yml` file in the `models/`, `snapshots/`, `seeds/`, `analyses/`, or `tests/` directory.
2. Using a [`config` property](/reference/resource-properties/config) in a `.yml` file for supported resource directories like `models/`, `snapshots/`, `seeds/`, `analyses/`, or `tests/`.
3. From the [`dbt_project.yml` file](dbt_project.yml), under the corresponding resource key (`models:`, `snapshots:`, `tests:`, etc)
</VersionBlock>

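To make these options concrete, here's a minimal sketch of the `config()` Jinja macro approach in a model file (the model name and config values are illustrative):

```sql
-- models/customers.sql
{{
  config(
    materialized='table',
    schema='marts',
    tags=['daily']
  )
}}

select * from {{ ref('stg_customers') }}
```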
10 changes: 10 additions & 0 deletions website/docs/reference/resource-configs/_category.yaml
@@ -0,0 +1,10 @@
# position: 2.5 # float position is supported
label: 'Resource configs'
collapsible: true # make the category collapsible
collapsed: true # keep the category collapsed by default
className: red
link:
  type: generated-index
  title: Resource configs
customProps:
  description: Platform-specific configs are used to configure the dbt project for a specific database platform.
29 changes: 25 additions & 4 deletions website/docs/reference/resource-configs/begin.md
@@ -11,13 +11,15 @@ datatype: string

## Definition

Set the `begin` config to the timestamp value at which your [microbatch incremental model](/docs/build/incremental-microbatch) data should begin &mdash; at the point the data becomes relevant for the microbatch model. You can configure `begin` for a [model](/docs/build/models) in your `dbt_project.yml` file, property YAML file, or config block. The value for `begin` must be a string representing an ISO-formatted date _or_ date and time.
Set the `begin` config to the timestamp value at which your [microbatch incremental model](/docs/build/incremental-microbatch) data should begin &mdash; at the point the data becomes relevant for the microbatch model.

You can configure `begin` for a [model](/docs/build/models) in your `dbt_project.yml` file, property YAML file, or config block. The value for `begin` must be a string representing an ISO-formatted date _or_ date and time, and you can also compute it dynamically using [relative dates](#set-begin-to-use-relative-dates). Check out the [examples](#examples) in the next section for more details.

## Examples

The following examples set `2024-01-01 00:00:00` as the `begin` config for the `user_sessions` model.

Example in the `dbt_project.yml` file:
#### Example in the `dbt_project.yml` file

<File name='dbt_project.yml'>

@@ -29,7 +31,7 @@ models:
```
</File>
Example in a properties YAML file:
#### Example in a properties YAML file
<File name='models/properties.yml'>
@@ -42,7 +44,7 @@ models:
</File>
Example in sql model config block:
#### Example in a SQL model config block
<File name="models/user_sessions.sql">
@@ -53,3 +55,22 @@ Example in sql model config block:
```

</File>

#### Set `begin` to use relative dates

To configure `begin` to use relative dates, you can use the [`modules.datetime`](/reference/dbt-jinja-functions/modules#datetime) and [`modules.pytz`](/reference/dbt-jinja-functions/modules#pytz) Jinja modules to dynamically specify relative timestamps, such as yesterday's date or the start of the current week.

For example, to set `begin` to yesterday's date:

```sql
{{
  config(
    materialized='incremental',
    incremental_strategy='microbatch',
    unique_key='run_id',
    begin=(modules.datetime.datetime.now() - modules.datetime.timedelta(1)).isoformat(),
    event_time='created_at',
    batch_size='day',
  )
}}
```
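
And a sketch of the other case mentioned above, setting `begin` to the start of the current week (assuming the week starts on Monday), for the same hypothetical `user_sessions` model:

```sql
{{
  config(
    materialized='incremental',
    incremental_strategy='microbatch',
    unique_key='run_id',
    -- subtract the current weekday (Monday = 0) to land on the start of the week
    begin=(modules.datetime.datetime.now() - modules.datetime.timedelta(days=modules.datetime.datetime.now().weekday())).date().isoformat(),
    event_time='created_at',
    batch_size='day',
  )
}}
```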
Empty file.
1 change: 0 additions & 1 deletion website/package-lock.json


7 changes: 7 additions & 0 deletions website/sidebars.js
@@ -894,6 +894,13 @@ const sidebarSettings = {
    {
      type: "category",
      label: "Platform-specific configs",
      link: {
        type: "generated-index",
        title: "Platform-specific configs",
        description:
          "Platform-specific configs are used to configure the dbt project for a specific database platform.",
        slug: "/reference/resource-configs/resource-configs",
      },
      items: [
        "reference/resource-configs/athena-configs",
        "reference/resource-configs/impala-configs",
2 changes: 1 addition & 1 deletion website/snippets/_enterprise-permissions-table.md
@@ -112,7 +112,7 @@ Key:
| Jobs | W | R* | R* | R* | R* | W | R | R | - | - | R | R* | - |
| Metadata GraphQL API access| R | R | R | R | R | R | - | R | R | - | R | R | - |
| Permissions | W | - | R | R | R | - | - | - | - | - | - | R | - |
| Projects | W | W | W | W | W | R | - | R | - | - | R | W | - |
| Projects | W | R | W | W | W | R | - | R | - | - | R | W | - |
| Repositories | W | - | R | R | W | - | - | - | - | - | R | R | - |
| Runs | W | R* | R* | R* | R* | W | W | R | - | - | R | R* | - |
| Semantic Layer config | W | R | W | R | R | R | - | - | - | W | R | R | - |
Binary file modified website/static/img/docs/dbt-cloud/visual-editor/config-panel.png
Binary file modified website/static/img/docs/dbt-cloud/visual-editor/connector.png
Binary file modified website/static/img/docs/dbt-cloud/visual-editor/operator.png
