Releases: brooklyn-data/dbt_artifacts

1.1.2

12 Aug 08:48
77d356e

Handles single quotes in exposure descriptions and in the loaded_at_field of sources.

1.1.1

11 Aug 14:56
0e39831

Fixes a missing database qualifier in two Snowflake create table statement macros.

1.1.0

11 Aug 12:14
e368c93

A quick follow-up release which adds support for dbt-spark, so that the package can be used with Databricks in dbt Cloud (which currently supports only the dbt-spark adapter).

1.0.0

10 Aug 08:33
2ded64f

The first major release of dbt_artifacts! 🎉

This is a complete rewrite of the package, built on the graph context variable and the results variable available in the on-run-end context. It solves multiple issues from pre-v1:

  • Overcomes the 16MB variant limit in Snowflake
  • Uses an on-run-end hook which runs regardless of whether the run succeeded, mitigating an issue in dbt Cloud where the upload step wouldn't run if a previous step failed
  • Makes support for further databases much easier (this release adds Databricks support from the outset)

In addition, performance is greatly improved by avoiding the need to process any JSON files, and all models are views.
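Since the upload now happens from the on-run-end context, installation amounts to wiring a hook into dbt_project.yml. The sketch below shows the shape of that wiring; the macro name upload_results is an assumption based on the package's setup instructions, so verify it against the README for your installed version:

```yaml
# dbt_project.yml — sketch of wiring the package's upload into an
# on-run-end hook. The hook runs after every invocation and passes the
# `results` context variable to the package.
# (macro name `upload_results` is assumed, not stated in these notes)
on-run-end:
  - "{{ dbt_artifacts.upload_results(results) }}"
```

Because the hook fires whether or not the run succeeded, failed runs are captured too, which is what resolves the dbt Cloud issue described above.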

Existing dbt_artifacts <1.0.0 user?

Migration guide and run-operation

If you are listing your config variables under a dbt_artifacts key in dbt_project.yml, this key must be removed and the variables listed as top level keys under vars (why?).

Not this:

vars:
  dbt_artifacts:
    dbt_artifacts_database: your_db
    dbt_artifacts_schema: your_schema

This:

vars:
  dbt_artifacts_database: your_db
  dbt_artifacts_schema: your_schema

Contributors

Thank you to:

And all those who have tested and provided feedback on the beta releases!

1.0.0b2

09 Aug 08:39
7468a44
Pre-release

Incorporates bug fixes, some new columns, a new dim_dbt__current_models.sql model, and a migration tool.

1.0.0b1

21 Jul 21:17
fca2ec1
Pre-release

The first beta release of 1.0.0.

This is a complete rewrite of the package, built on the graph context variable and the results variable available in the on-run-end context. It solves multiple issues from pre-v1:

  • Overcomes the 16MB variant limit in Snowflake
  • Solves an issue where failing steps would block subsequent steps from running, preventing the data from being uploaded
  • Makes support for further databases much easier (this release adds Databricks support from the outset)
  • Improves performance by avoiding the need to process any JSON files; all basic node and node execution models are views

While similar, the schemas of 1.0.0 and pre-1.0.0 do differ, so consider this release a fresh install. We'll investigate the possibility of writing migration code for users who wish to retain their existing pre-1.0.0 data.

This release lacks the following models from earlier versions:

If these are still of interest, please feel free to create an issue so that we can get a sense of their usefulness!

Install with:

packages:
  - git: https://github.com/brooklyn-data/dbt_artifacts.git
    revision: 1.0.0b1

0.8.0

19 Apr 12:01
0127c46

Stable release incorporating the new v2 artifact upload method, which flattens artifacts on load, overcoming the 16MB variant limit for large projects and enabling more efficient full refreshes.

0.8.0a3

30 Mar 15:42
79109a9
Pre-release

Prefixes all created stages for artifact uploads with the invocation_id, resolving #111.

0.8.0a2

21 Mar 14:07
ea64592
Pre-release

Adds some bug fixes ahead of the final 0.8.0 release.

0.8.0a1

22 Feb 09:24
f435114
Pre-release

This pre-release adds an upload_dbt_artifacts_v2 macro, which uploads and flattens manifest.json and run_results.json on load into separate tables, avoiding the 16MB variant field limit. To use it, first run the create_artifact_resources run operation, which creates the new tables needed.
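Under those instructions, setup might look like the fragment below. Only the names create_artifact_resources and upload_dbt_artifacts_v2 come from these notes; the on-run-end wiring and the macro's (argument-free) call are assumptions, so check the package README for the exact signature:

```yaml
# dbt_project.yml — sketch only; hook wiring and macro arguments are assumed.
# One-time setup from the command line first, to create the new tables:
#   dbt run-operation create_artifact_resources
on-run-end:
  - "{{ dbt_artifacts.upload_dbt_artifacts_v2() }}"
```

Running the run operation once up front matters because the v2 upload writes into the separate flattened tables it creates, rather than into a single variant column.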