ARCHITECTURE.md


The core function of dbt is SQL compilation and execution. Users create projects of dbt resources (models, tests, seeds, snapshots, ...), defined in SQL and YAML files, and they invoke dbt to create, update, or query associated views and tables. Today, dbt makes heavy use of Jinja2 to enable the templating of SQL, and to construct a DAG (Directed Acyclic Graph) from all of the resources in a project. Users can also extend their projects by installing resources (including Jinja macros) from other projects, called "packages."
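To make the templating idea concrete, here is a minimal sketch of rendering a SQL model through Jinja2. The `ref` stub below is a stand-in: dbt's real `ref()` resolves the referenced model to a schema-qualified relation and records a DAG edge, which this toy version does not.

```python
from jinja2 import Environment

env = Environment()

def ref(model_name):
    # Stand-in for dbt's real ref(): just return a quoted relation name.
    # The "analytics" schema here is an arbitrary example value.
    return f'"analytics"."{model_name}"'

model_sql = """
select customer_id, count(*) as order_count
from {{ ref('stg_orders') }}
group by 1
"""

# Render the Jinja template into executable SQL.
rendered = env.from_string(model_sql).render(ref=ref)
print(rendered)
```

Because `ref()` is an ordinary function call inside the template, dbt can observe which models a given model selects from, which is how the DAG gets built.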

dbt-core

Most of the python code in the repository is within the core/dbt directory. Currently the main subdirectories are:

  • adapters: Define base classes for behavior that is likely to differ across databases
  • clients: Interface with dependencies (agate, jinja) or across operating systems
  • config: Reconcile user-supplied configuration from connection profiles, project files, and Jinja macros
  • context: Build and expose dbt-specific Jinja functionality
  • contracts: Define Python objects (dataclasses) that dbt expects to create and validate
  • deps: Package installation and dependency resolution
  • graph: Produce a networkx DAG of project resources, and select those resources given user-supplied criteria
  • include: The dbt "global project," which defines default implementations of Jinja2 macros
  • parser: Read project files, validate, construct python objects
  • rpc: Provide remote procedure call server for invoking dbt, following JSON-RPC 2.0 spec
  • task: Set forth the actions that dbt can perform when invoked
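As an illustration of what the graph subdirectory produces, here is a toy networkx DiGraph of project resources. The node IDs are hypothetical examples, and the selection logic is simplified to plain graph traversal, but the shape is the same: nodes are resources, edges point from a resource to the resources that depend on it.

```python
import networkx as nx

dag = nx.DiGraph()
# Edges point downstream: stg_orders and stg_customers feed orders,
# and orders feeds a test. (Example node IDs, not real project content.)
dag.add_edge("model.jaffle.stg_orders", "model.jaffle.orders")
dag.add_edge("model.jaffle.stg_customers", "model.jaffle.orders")
dag.add_edge("model.jaffle.orders", "test.jaffle.not_null_orders_id")

# A topological sort yields a valid execution order: each resource runs
# only after everything it selects from.
order = list(nx.topological_sort(dag))
assert order.index("model.jaffle.stg_orders") < order.index("model.jaffle.orders")

# Graph selection reduces, at its core, to ancestor/descendant queries.
downstream_of_orders = nx.descendants(dag, "model.jaffle.orders")
print(sorted(downstream_of_orders))
```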

Invoking dbt

There are two supported ways of invoking dbt: from the command line and using an RPC server.
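For the RPC path, requests follow the JSON-RPC 2.0 envelope. The sketch below builds such an envelope; the `"run"` method name and empty params are illustrative, not a catalog of the server's actual API surface.

```python
import json
import uuid

request = {
    "jsonrpc": "2.0",         # fixed by the JSON-RPC 2.0 spec
    "method": "run",          # illustrative: a method mapping to a dbt task
    "id": str(uuid.uuid4()),  # client-chosen request ID
    "params": {},
}
payload = json.dumps(request)
print(payload)
```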

The "tasks" map to top-level dbt commands: dbt run => task.run.RunTask, and so on. Some are more like abstract base classes (GraphRunnableTask, for example), but all the concrete types outside of task/rpc should map to tasks. Currently, only one task executes at a time. Each task kicks off its "Runners", which do execute in parallel; the parallelism is managed via a thread pool in GraphRunnableTask.
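The task/runner split can be sketched with a thread pool, as GraphRunnableTask uses internally. Everything below other than the thread-pool pattern itself is invented for illustration: `run_node` stands in for a runner that would compile and execute one node's SQL.

```python
from concurrent.futures import ThreadPoolExecutor

def run_node(node_id):
    # Stand-in for a runner: in dbt this would compile and execute
    # one node's SQL against the warehouse.
    return f"{node_id}: success"

nodes = ["model.a", "model.b", "model.c"]

# One task at a time, but its runners execute concurrently;
# max_workers plays the role of dbt's `threads` configuration.
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(run_node, nodes))

print(results)
```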

core/dbt/include/index.html: This is the docs website code. It comes from the dbt-docs repository, and is generated when a release is packaged.

Adapters

dbt uses an adapter-plugin pattern to extend support to different databases, warehouses, query engines, etc. The four core adapters in the main repository, contained within the plugins subdirectory, are Postgres, Redshift, Snowflake, and BigQuery. Other warehouses use adapter plugins defined in separate repositories (e.g. dbt-spark, dbt-presto).

Each adapter is a mix of python, Jinja2, and SQL. The adapter code also makes heavy use of Jinja2 to wrap modular chunks of SQL functionality, define default implementations, and allow plugins to override them.

Each adapter plugin is a standalone python package that includes:

  • dbt/include/[name]: A "sub-global" dbt project, of YAML and SQL files, that reimplements Jinja macros to use the adapter's supported SQL syntax
  • dbt/adapters/[name]: Python modules that inherit, and optionally reimplement, the base adapter classes defined in dbt-core
  • setup.py
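The inheritance half of this pattern can be sketched as follows. The class and method names below are simplified stand-ins, not dbt-core's actual class hierarchy: a base class supplies default behavior, and a plugin subclass overrides only what its database does differently.

```python
class BaseAdapter:
    """Stand-in for a dbt-core base adapter class with sensible defaults."""

    def quote(self, identifier: str) -> str:
        return f'"{identifier}"'  # ANSI-style double quotes by default

    def date_trunc(self, part: str, expr: str) -> str:
        return f"date_trunc('{part}', {expr})"


class BigQueryLikeAdapter(BaseAdapter):
    """A plugin overrides only the behavior that differs on its warehouse."""

    def quote(self, identifier: str) -> str:
        return f"`{identifier}`"  # backtick quoting

    def date_trunc(self, part: str, expr: str) -> str:
        return f"date_trunc({expr}, {part})"  # reversed argument order


adapter = BigQueryLikeAdapter()
print(adapter.quote("orders"))
```

The "sub-global" project plays the same role on the Jinja side: macros that are not overridden fall back to the defaults shipped in dbt-core's global project.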

The Postgres adapter code is the most central, and many of its implementations are used as the default defined in the dbt-core global project. The greater the distance of a data technology from Postgres, the more its adapter plugin may need to reimplement.

Testing dbt

The test/ subdirectory includes unit and integration tests that run as continuous integration checks against open pull requests. Unit tests check mock inputs and outputs of specific python functions. Integration tests perform end-to-end dbt invocations against real adapters (Postgres, Redshift, Snowflake, BigQuery) and assert that the results match expectations. See the contributing guide for a step-by-step walkthrough of setting up a local development and testing environment.
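The unit-test style described above, in miniature: call one function with mock inputs and assert on its outputs. The function under test here is a hypothetical example written for this sketch, not code from the dbt repository.

```python
def render_schema(custom_schema, default_schema="analytics"):
    # Hypothetical helper: combine a default schema with an optional
    # custom schema name, in the spirit of dbt's schema-naming helpers.
    if custom_schema is None:
        return default_schema
    return f"{default_schema}_{custom_schema}"


# Unit tests exercise the function directly with mock inputs.
assert render_schema(None) == "analytics"
assert render_schema("marketing") == "analytics_marketing"
```

Integration tests, by contrast, would invoke dbt end-to-end against a live warehouse and assert on the resulting tables and views.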

Everything else

  • docker: All dbt versions are published as Docker images on DockerHub. This subfolder contains the Dockerfile (constant) and requirements.txt (one for each version).
  • etc: Images for README
  • scripts: Helper scripts for testing, releasing, and producing JSON schemas. These are not included in distributions of dbt, nor are they rigorously tested; they're just handy tools for the dbt maintainers :)