Commit 342a7bf

🎨 [pre-commit.ci] Auto format from pre-commit.com hooks
pre-commit-ci[bot] committed Sep 24, 2024
1 parent 1582719 commit 342a7bf
Showing 2 changed files with 11 additions and 11 deletions.
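
This commit was produced automatically by the pre-commit.ci bot. The same hooks can usually be run locally before pushing; a minimal sketch, assuming the repository ships the .pre-commit-config.yaml that pre-commit.ci requires:

    # Install pre-commit and run every configured hook against the whole repo
    pip install pre-commit
    pre-commit run --all-files
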
16 changes: 8 additions & 8 deletions docs/getting_started/gcp-cloud-run-job.rst
@@ -16,14 +16,14 @@ Prerequisites
 4. Astronomer-cosmos package containing the dbt Cloud Run Job operators
 5. GCP account with:
    1. A GCP project (`setup guide <https://cloud.google.com/resource-manager/docs/creating-managing-projects#console>`_)
-   2. IAM roles:
-      * Basic Role: `Owner <https://cloud.google.com/iam/docs/understanding-roles#owner>`_ (control over whole project) or
+   2. IAM roles:
+      * Basic Role: `Owner <https://cloud.google.com/iam/docs/understanding-roles#owner>`_ (control over whole project) or
       * Predefined Roles: `Artifact Registry Administrator <https://cloud.google.com/iam/docs/understanding-roles#artifactregistry.admin>`_, `Cloud Run Developer <https://cloud.google.com/iam/docs/understanding-roles#run.developer>`_ (control over specific services)
    3. Enabled service APIs:
-      * Artifact Registry API
+      * Artifact Registry API
       * Cloud Run Admin API
       * BigQuery API
-   4. A service account with BigQuery roles: `JobUser <https://cloud.google.com/iam/docs/understanding-roles#bigquery.jobUser>`_ and `DataEditor <https://cloud.google.com/iam/docs/understanding-roles#bigquery.dataEditor>`_
+   4. A service account with BigQuery roles: `JobUser <https://cloud.google.com/iam/docs/understanding-roles#bigquery.jobUser>`_ and `DataEditor <https://cloud.google.com/iam/docs/understanding-roles#bigquery.dataEditor>`_
 6. Docker image built with required dbt project and dbt DAG
 7. dbt DAG with Cloud Run Job operators in the Airflow DAGs directory to run in Airflow
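
The three service APIs listed under prerequisite 5.3 can be enabled up front with the gcloud CLI; a minimal sketch, assuming gcloud is already authenticated against the target project:

    # Enable the APIs this guide depends on (idempotent, safe to re-run)
    gcloud services enable \
        artifactregistry.googleapis.com \
        run.googleapis.com \
        bigquery.googleapis.com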

@@ -85,7 +85,7 @@ In case BigQuery has never been used before in the project, run below command to
 **Setup Artifact Registry**
 
-In order to run a container in Cloud Run Job, it needs access to the container image. In our setup, we will use Artifact Registry repository that stores images.
+In order to run a container in Cloud Run Job, it needs access to the container image. In our setup, we will use Artifact Registry repository that stores images.
 To use Artifact Registry, you need to enable the API first:
 
 .. code-block:: bash
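
The body of that code block is collapsed at the hunk boundary above. For orientation, enabling the API and creating a Docker-format repository typically look like the following; a sketch, not the file's actual contents, reusing the $REPO_NAME and $REGION variables that appear in the cleanup section below:

    # Enable the Artifact Registry API
    gcloud services enable artifactregistry.googleapis.com

    # Create a Docker-format repository for the dbt project image
    gcloud artifacts repositories create $REPO_NAME \
        --repository-format=docker \
        --location=$REGION
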
@@ -177,7 +177,7 @@ First, enable Cloud Run Admin API using below command:

 .. code-block:: bash
 
-    gcloud services enable run.googleapis.com
+    gcloud services enable run.googleapis.com
 
 Next, set default Cloud Run region to your GCP region:
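
The region default itself is a gcloud configuration property; a minimal sketch, assuming $REGION holds a valid Cloud Run region such as us-central1:

    # Make subsequent `gcloud run` commands default to this region
    gcloud config set run/region $REGION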
@@ -243,7 +243,7 @@ You can also verify the tables that were created using dbt in BigQuery Studio:
 After the successfull tests, don't forget to delete Google Cloud resources to save up costs:
 
 .. code-block:: bash
 
     # Delete Cloud Run Job instance
     gcloud run jobs delete $CLOUD_RUN_JOB_NAME
@@ -261,4 +261,4 @@ After the successfull tests, don't forget to delete Google Cloud resources to sa
     # Delete Artifact Registry repository with all images included
     gcloud artifacts repositories delete $REPO_NAME \
-    --location=$REGION
+    --location=$REGION
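
After the cleanup commands, it is worth confirming that nothing billable remains; a sketch, assuming the same $REGION variable:

    # Both listings should no longer show the deleted resources
    gcloud run jobs list --region=$REGION
    gcloud artifacts repositories list --location=$REGION
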
6 changes: 3 additions & 3 deletions tests/operators/test_gcp_cloud_run_job.py
@@ -9,14 +9,14 @@

 try:
     from cosmos.operators.gcp_cloud_run_job import (
+        DbtBuildGcpCloudRunJobOperator,
         DbtGcpCloudRunJobBaseOperator,
         DbtLSGcpCloudRunJobOperator,
         DbtRunGcpCloudRunJobOperator,
+        DbtRunOperationGcpCloudRunJobOperator,
         DbtSeedGcpCloudRunJobOperator,
-        DbtTestGcpCloudRunJobOperator,
-        DbtBuildGcpCloudRunJobOperator,
         DbtSnapshotGcpCloudRunJobOperator,
-        DbtRunOperationGcpCloudRunJobOperator,
+        DbtTestGcpCloudRunJobOperator,
     )
 
 class ConcreteDbtGcpCloudRunJobOperator(DbtGcpCloudRunJobBaseOperator):
