[FSTORE-933] DBT Scheduling with External FG (#176)
* Tutorial for DBT with External FG
Showing 14 changed files with 288 additions and 395 deletions.
def model(dbt, session):
    # Run this Python model on a Dataproc cluster
    dbt.config(
        submission_method="cluster",
        dataproc_cluster_name="{YOUR_DATAPROC_CLUSTER_NAME}",
    )

    # Read the data_pipeline Python model
    data_pipeline = dbt.ref("data_pipeline")

    # Define the list of columns to drop
    columns_to_drop = ['index_column', 'hour', 'day', 'temperature_diff', 'wind_speed_category']

    # Drop the specified columns
    data_pipeline = data_pipeline.drop(*columns_to_drop)

    # Write the data to a BigQuery table
    data_pipeline.write.format('bigquery') \
        .option('table', '{YOUR_DATASET_NAME}.{YOUR_TABLE_NAME}') \
        .mode('append') \
        .save()

    return data_pipeline
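The `drop(*columns_to_drop)` step above is simply key removal over each row's schema. A minimal pure-Python sketch of the same idea, with no Spark dependency (the record contents here are invented for illustration):

```python
# Hypothetical stand-in for the Spark DataFrame.drop(*columns_to_drop) step:
# strip the unwanted keys from each record before writing it downstream.
columns_to_drop = ['index_column', 'hour', 'day', 'temperature_diff', 'wind_speed_category']

records = [
    {'index_column': 0, 'hour': 14, 'day': 3, 'city': 'Oslo',
     'temperature_diff': 1.2, 'wind_speed_category': 'low', 'temperature': 7.5},
]

cleaned = [{k: v for k, v in row.items() if k not in columns_to_drop}
           for row in records]
print(cleaned)  # each record keeps only 'city' and 'temperature'
```

In the Spark version, `drop` ignores column names that are absent from the DataFrame, so the list can safely include columns produced only by some upstream runs.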