[ISSUE] databricks_storage_credential: No API found for PUT even with latest provider version 1.34 #3134
Comments
If you want to use the account-level provider: you don't have a provider = databricks.account argument in the resource block, so the resource uses the databricks provider attached to the workspace.
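For concreteness, a minimal sketch of that suggestion, assuming a provider alias databricks.account is already declared in the same module (the resource name and the azure_managed_identity values are illustrative placeholders, not taken from this issue):

resource "databricks_storage_credential" "example" {
  # Without this argument Terraform falls back to the default
  # (workspace-level) databricks provider of the module.
  provider = databricks.account

  name = "example_credential"

  azure_managed_identity {
    access_connector_id = var.access_connector_id    # placeholder
  }
}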
In the issue I had shortened the information. The configuration is actually executed by a child module which has only one databricks provider. The root module containing the provider configurations passes only the databricks.account provider to the child module, so no provider argument should be required in the resource blocks of the child module. The root module / child module layout is roughly as in the sketch below.
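A hedged reconstruction of that layout, assuming the child module declares a single databricks provider requirement (module name, source path and variables are placeholders):

# root module (illustrative): only the account-level provider is passed down
provider "databricks" {
  alias      = "account"
  host       = "https://accounts.azuredatabricks.net"
  account_id = var.databricks_account_id    # placeholder
}

module "unity" {
  source = "./modules/unity"    # placeholder path

  providers = {
    # becomes the default databricks provider inside the child module
    databricks = databricks.account
  }
}

# child module (illustrative): a single provider requirement, so resources
# inside it need no explicit provider argument
terraform {
  required_providers {
    databricks = {
      source = "databricks/databricks"
    }
  }
}

resource "databricks_storage_credential" "this" {
  name = var.credential_name    # placeholder

  azure_managed_identity {
    access_connector_id = var.access_connector_id    # placeholder
  }
}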
Same issue with AWS, same provider version.

variable "unity_catalogs" {
  type    = set(string)
  default = ["staging"]
}

# Account level catalog owners
resource "databricks_service_principal" "catalog" {
  for_each     = var.unity_catalogs
  display_name = "unity_catalog_${each.value}_sp"
  provider     = databricks.account
}

resource "databricks_storage_credential" "catalog_staging" {
  name  = "catalog_staging"
  owner = databricks_service_principal.catalog["staging"].application_id

  aws_iam_role {
    role_arn = "arn:aws:iam::1234567890:role/MyRole-AJJHDSKSDF"
  }

  provider = databricks.account
}

Logs:
Setting metastore_id on the resource as well:

resource "databricks_storage_credential" "catalog_staging" {
  name         = "catalog_staging"
  owner        = databricks_service_principal.catalog["staging"].application_id
  metastore_id = databricks_metastore.unity_metastore.id

  aws_iam_role {
    role_arn = "arn:aws:iam::1234567890:role/MyRole-AJJHDSKSDF"
  }

  provider = databricks.account
}

However, setting … Logs:

For some reason … Also, I think …
After having added …, the provider fails with this configuration:

resource "databricks_storage_credential" "datastore_credentials" {
  name         = local.uc_datastore_name
  metastore_id = var.db_metastore_id

  azure_managed_identity {
    access_connector_id = azurerm_databricks_access_connector.dac_datastore.id
    managed_identity_id = azurerm_user_assigned_identity.uai_dac.id
  }

  owner   = var.metastore_admin_group_display_name
  comment = "Managed by TF"
}

DEBUG:
#3184 fixes the issue.
I have exactly the same issue as described in #2697, so I won't repeat all the details here.
But I still have this problem with the latest provider version, 1.34.
Configuration
Steps to Reproduce
Terraform and provider versions
Debug Output
Important Factoids
I'm using Azure Databricks
I'm deploying the resource using an Account-level provider
The Service Principal deploying the Storage Credential is an Account & Metastore Admin
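For context, an account-level provider block for Azure Databricks typically looks roughly like the sketch below; the account ID and service-principal credentials are placeholder variables, not values from this issue.

provider "databricks" {
  alias               = "account"
  host                = "https://accounts.azuredatabricks.net"
  account_id          = var.databricks_account_id    # placeholder
  azure_client_id     = var.sp_client_id             # placeholder
  azure_client_secret = var.sp_client_secret         # placeholder
  azure_tenant_id     = var.azure_tenant_id          # placeholder
}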