databricks/terraform-provider-databricks

Latest commit: 0d49760 · Dec 9, 2024

Databricks Terraform Provider

Resources

Troubleshooting Guide | AWS tutorial | Azure tutorial | End-to-end tutorial | Changelog | Authentication | databricks_aws_assume_role_policy data | databricks_aws_bucket_policy data | databricks_aws_crossaccount_policy data | databricks_catalog | databricks_catalogs data | databricks_cluster | databricks_clusters data | databricks_cluster_policy | databricks_current_user | databricks_dbfs_file | databricks_dbfs_file_paths data | databricks_dbfs_file data | databricks_directory | databricks_external_location | databricks_git_credential | databricks_global_init_script | databricks_grant | databricks_grants | databricks_group | databricks_group data | databricks_group_instance_profile | databricks_group_member | databricks_instance_pool | databricks_instance_profile | databricks_ip_access_list | databricks_job | databricks_job data | databricks_jobs | databricks_lakehouse_monitor | databricks_library | databricks_metastore | databricks_metastore_assignment | databricks_metastore_data_access | databricks_mlflow_model | databricks_mlflow_experiment | databricks_mlflow_webhook | databricks_model_serving | databricks_mount | databricks_mws_credentials | databricks_mws_customer_managed_keys | databricks_mws_log_delivery | databricks_mws_networks | databricks_mws_permission_assignment | databricks_mws_private_access_settings | databricks_mws_storage_configurations | databricks_mws_vpc_endpoint | databricks_mws_workspaces | databricks_mws_workspaces data | databricks_node_type data | databricks_notebook | databricks_notebook data | databricks_notebook_paths data | databricks_obo_token | databricks_permissions | databricks_pipeline | databricks_pipelines data | databricks_quality_monitor | databricks_repo | databricks_schema | databricks_schemas data | databricks_secret | databricks_secret_acl | databricks_secret_scope | databricks_service_principal | databricks_service_principals data | databricks_service_principal_role | databricks_spark_version data | 
databricks_sql_dashboard | databricks_sql_endpoint | databricks_sql_global_config | databricks_sql_permissions | databricks_sql_query | databricks_sql_visualization | databricks_sql_warehouse data | databricks_sql_warehouses data | databricks_sql_widget | databricks_storage_credential | databricks_tables data | databricks_token | databricks_user | databricks_user_role | databricks_user_instance_profile | databricks_views data | databricks_volume | databricks_workspace_conf | databricks_zones | Contributing and Development Guidelines


The Databricks Terraform provider works with Terraform 1.0 or newer. To use it, follow the instructions on the registry page:

terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}
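For reproducible runs, it is also common to pin a provider version in the same block. The constraint below is illustrative, not a recommendation of a specific release:

```hcl
terraform {
  required_providers {
    databricks = {
      source  = "databricks/databricks"
      # Hypothetical example constraint; pick the release range you need.
      version = "~> 1.0"
    }
  }
}
```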

If you want to build it from source, refer to the contributing guidelines.

Then create a small sample file named main.tf with approximately the following contents. Replace <your PAT token> with a newly created PAT token.

provider "databricks" {
  host  = "https://abc-defg-024.cloud.databricks.com/"
  token = "<your PAT token>"
}

data "databricks_current_user" "me" {}
data "databricks_spark_version" "latest" {}
data "databricks_node_type" "smallest" {
  local_disk = true
}

resource "databricks_notebook" "this" {
  path     = "${data.databricks_current_user.me.home}/Terraform"
  language = "PYTHON"
  content_base64 = base64encode(<<-EOT
    # created from ${abspath(path.module)}
    display(spark.range(10))
    EOT
  )
}

resource "databricks_job" "this" {
  name = "Terraform Demo (${data.databricks_current_user.me.alphanumeric})"

  new_cluster {
    num_workers   = 1
    spark_version = data.databricks_spark_version.latest.id
    node_type_id  = data.databricks_node_type.smallest.id
  }

  notebook_task {
    notebook_path = databricks_notebook.this.path
  }
}

output "notebook_url" {
  value = databricks_notebook.this.url
}

output "job_url" {
  value = databricks_job.this.url
}

Then run terraform init followed by terraform apply to apply the HCL code to your Databricks workspace.
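Hard-coding host and token in main.tf is convenient for a first run, but the provider also reads them from environment variables, which keeps credentials out of source control. A sketch, using the placeholder values from the example above:

```shell
# Configure provider authentication via environment variables instead of
# the host/token arguments in the provider block. Values are placeholders.
export DATABRICKS_HOST="https://abc-defg-024.cloud.databricks.com/"
export DATABRICKS_TOKEN="<your PAT token>"
```

With these set, the provider "databricks" block in main.tf can be left empty.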

OpenTofu Support

OpenTofu is an open-source fork of Terraform under the MPL 2.0 license. The Databricks Terraform provider should be compatible with OpenTofu, but this integration is not actively tested and should be considered experimental. Please raise a GitHub issue if you find any incompatibility.

Switching from databrickslabs to databricks namespace

To make the Databricks Terraform Provider generally available, we've moved it from https://github.com/databrickslabs to https://github.com/databricks. We've worked closely with the Terraform Registry team at HashiCorp to ensure a smooth migration. Existing Terraform deployments continue to work as expected without any action on your side. We ask you to replace databrickslabs/databricks with databricks/databricks in all your .tf files.

You should have a .terraform.lock.hcl file in your state directory that is checked into source control. terraform init will give you the following warning:

Warning: Additional provider information from registry 

The remote registry returned warnings for registry.terraform.io/databrickslabs/databricks:
- For users on Terraform 0.13 or greater, this provider has moved to databricks/databricks. Please update your source in required_providers.

After you replace databrickslabs/databricks with databricks/databricks in the required_providers block, the warning will disappear. Do a global search and replace in *.tf files. Alternatively, you can run python3 -c "$(curl -Ls https://dbricks.co/updtfns)" from the command line, which does all the boring work for you.
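The same global search and replace can be done with a plain one-liner; this is a sketch that assumes GNU sed and GNU xargs, and it edits files in place, so commit or back up first:

```shell
# Rewrite the provider namespace in every .tf file under the current directory.
# Assumes GNU sed (-i with no suffix) and GNU xargs (-r skips empty input).
grep -rl --include='*.tf' 'databrickslabs/databricks' . \
  | xargs -r sed -i 's|databrickslabs/databricks|databricks/databricks|g'
```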

If you didn't check .terraform.lock.hcl into source control, you may see a Failed to install provider error. Please follow the simple steps described in the troubleshooting guide.

Use of Terraform exporter

The exporter functionality is experimental and provided as-is. It has an evolving interface, which may change or be removed in future versions of the provider.