fixed merge conflicts
dgomez04 committed Dec 16, 2024
2 parents fcd33dd + f716018 commit 7672d40
Showing 135 changed files with 111,876 additions and 2,799 deletions.
3 changes: 0 additions & 3 deletions .codegen.json
@@ -1,9 +1,6 @@
{
"formatter": "make fmt",
"mode": "tf_v1",
"packages": {
".codegen/model.go.tmpl": "internal/service/{{.Name}}_tf/model.go"
},
"changelog_config": ".codegen/changelog_config.yml",
"version": {
"common/version.go": "version = \"$VERSION\""
2 changes: 1 addition & 1 deletion .codegen/_openapi_sha
@@ -1 +1 @@
f2385add116e3716c8a90a0b68e204deb40f996c
7016dcbf2e011459416cf408ce21143bcc4b3a25
103 changes: 0 additions & 103 deletions .codegen/model.go.tmpl

This file was deleted.

1 change: 1 addition & 0 deletions .gitattributes
@@ -1,6 +1,7 @@
internal/service/apps_tf/model.go linguist-generated=true
internal/service/billing_tf/model.go linguist-generated=true
internal/service/catalog_tf/model.go linguist-generated=true
internal/service/cleanrooms_tf/model.go linguist-generated=true
internal/service/compute_tf/model.go linguist-generated=true
internal/service/dashboards_tf/model.go linguist-generated=true
internal/service/files_tf/model.go linguist-generated=true
2 changes: 1 addition & 1 deletion .github/PULL_REQUEST_TEMPLATE.md
@@ -9,5 +9,5 @@ How is this tested? Please see the checklist below and also describe any other r
- [ ] `make test` run locally
- [ ] relevant change in `docs/` folder
- [ ] covered with integration tests in `internal/acceptance`
- [ ] relevant acceptance tests are passing
- [ ] using Go SDK
- [ ] using TF Plugin Framework
73 changes: 73 additions & 0 deletions CHANGELOG.md
@@ -1,5 +1,78 @@
# Version changelog

## [Release] Release v1.61.0

### New Features and Improvements

* Add `databricks_app` resource and data source ([#4099](https://github.com/databricks/terraform-provider-databricks/pull/4099)).


### Documentation

* Add a warning that attribute should be used in `databricks_permissions` for `databricks_vector_search_endpoint` ([#4312](https://github.com/databricks/terraform-provider-databricks/pull/4312)).


### Internal Changes

* Added TF Plugin Framework checkbox item to PR template and removed checkbox for integration tests passing ([#4227](https://github.com/databricks/terraform-provider-databricks/pull/4227)).
* Expose several integration test helpers for use in plugin framework integration tests ([#4310](https://github.com/databricks/terraform-provider-databricks/pull/4310)).
* Fix ReadOnly() for ListNestedAttribute and Validators for ListNestedBlock ([#4313](https://github.com/databricks/terraform-provider-databricks/pull/4313)).
* Panic if the provided path is invalid ([#4309](https://github.com/databricks/terraform-provider-databricks/pull/4309)).
* Simplify `databricks_storage_credential` code ([#4301](https://github.com/databricks/terraform-provider-databricks/pull/4301)).
* Use Attributes by default for List Objects ([#4315](https://github.com/databricks/terraform-provider-databricks/pull/4315)).
* Use Plugin Framework types internally in generated TF SDK structures ([#4291](https://github.com/databricks/terraform-provider-databricks/pull/4291)).


## [Release] Release v1.60.0

### New Features and Improvements

* Add `databricks_credential` resource ([#4219](https://github.com/databricks/terraform-provider-databricks/pull/4219)).
* Allow to filter jobs by name in `databricks_jobs` data source ([#3395](https://github.com/databricks/terraform-provider-databricks/pull/3395)).


### Bug Fixes

* Add client side validation for `volume_type` ([#4289](https://github.com/databricks/terraform-provider-databricks/pull/4289)).
* Forced send `auto_stop_mins` for `databricks_sql_endpoint` resource ([#4265](https://github.com/databricks/terraform-provider-databricks/pull/4265)).
* Handle deleted cluster gracefully ([#4280](https://github.com/databricks/terraform-provider-databricks/pull/4280)).
* Remove config drift if Azure SP is used in `databricks_credential` ([#4294](https://github.com/databricks/terraform-provider-databricks/pull/4294)).
* Use correct domain for Azure Gov and China ([#4274](https://github.com/databricks/terraform-provider-databricks/pull/4274)).
* don't start cluster if `warehouse_id` is specified for `databricks_sql_table` resource ([#4259](https://github.com/databricks/terraform-provider-databricks/pull/4259)).


### Documentation

* Document import support for `databricks_notification_destination` ([#4276](https://github.com/databricks/terraform-provider-databricks/pull/4276)).
* Update documentation for importing some MWS resources ([#4281](https://github.com/databricks/terraform-provider-databricks/pull/4281)).
* Update mws_log_delivery.md to add time_sleep ([#4258](https://github.com/databricks/terraform-provider-databricks/pull/4258)).
* Add missing H2 header in `mws_network_connectivity_configs.md` and optimization in `data_mws_network_connectivity_configs` ([#4256](https://github.com/databricks/terraform-provider-databricks/pull/4256)).


### Internal Changes

* Add ConvertToAttribute() to convert blocks in a resource/data source schema to attributes ([#4284](https://github.com/databricks/terraform-provider-databricks/pull/4284)).
* Bump Go SDK and generate TF structs ([#4300](https://github.com/databricks/terraform-provider-databricks/pull/4300)).
* Generate effective fields based on isServiceProposedIfEmpty ([#4282](https://github.com/databricks/terraform-provider-databricks/pull/4282)).
* Ignore Databricks Go SDK updates by dependabot ([#4253](https://github.com/databricks/terraform-provider-databricks/pull/4253)).
* Move TFSDK model template to universe ([#4303](https://github.com/databricks/terraform-provider-databricks/pull/4303)).
* Remove unused configuration from blocks ([#4283](https://github.com/databricks/terraform-provider-databricks/pull/4283)).
* Use isServiceProposedIfEmpty annotations for effective fields ([#4270](https://github.com/databricks/terraform-provider-databricks/pull/4270)).
* Use tf_v1 genkit mode ([#4278](https://github.com/databricks/terraform-provider-databricks/pull/4278)).


### Dependency Updates

* Bump github.com/stretchr/testify from 1.9.0 to 1.10.0 ([#4269](https://github.com/databricks/terraform-provider-databricks/pull/4269)).
* Bump github.com/zclconf/go-cty from 1.15.0 to 1.15.1 ([#4273](https://github.com/databricks/terraform-provider-databricks/pull/4273)).


### Exporter

* Fix generation of references to users for user directories ([#4297](https://github.com/databricks/terraform-provider-databricks/pull/4297)).
* better handling of online tables/vsis in listing ([#4288](https://github.com/databricks/terraform-provider-databricks/pull/4288)).


## [Release] Release v1.59.0

### New Features and Improvements
10 changes: 9 additions & 1 deletion CONTRIBUTING.md
@@ -135,10 +135,18 @@ We are migrating the resource from SDKv2 to Plugin Framework provider and hence
6. Create a PR and send it for review.
### Migrating resource to plugin framework
Ideally there shouldn't be any behaviour change when migrating a resource or data source to either Go SDk or Plugin Framework.
There must not be any behaviour change or schema change when migrating a resource or data source to either Go SDK or Plugin Framework.
- Please make sure there are no breaking differences due to changes in schema by running: `make diff-schema`.
- Integration tests shouldn't require any major changes.
By default, `ResourceStructToSchema` will convert a `types.List` field to a `ListAttribute` or `ListNestedAttribute`. For resources or data sources migrated from the SDKv2, `ListNestedBlock` must be used for such fields. To do this, call `cs.ConfigureAsSdkV2Compatible()` in the `ResourceStructToSchema` callback:
```go
resp.Schema = tfschema.ResourceStructToSchema(ctx, Resource{}, func(cs tfschema.CustomizableSchema) tfschema.CustomizableSchema {
cs.ConfigureAsSdkV2Compatible()
// Add any additional configuration here
return cs
})
```
### Code Organization
Each resource and data source should be defined in package `internal/providers/pluginfw/products/<resource>`, e.g. the `internal/providers/pluginfw/products/volume` package will contain the resource, data sources, and other utils specific to volumes. Tests (both unit and integration tests) will also remain in this package.
15 changes: 14 additions & 1 deletion catalog/resource_credential.go
@@ -11,7 +11,8 @@ import (

var credentialSchema = common.StructToSchema(catalog.CredentialInfo{},
func(m map[string]*schema.Schema) map[string]*schema.Schema {
var alofServiceCreds = []string{"aws_iam_role", "azure_managed_identity", "azure_service_principal"}
var alofServiceCreds = []string{"aws_iam_role", "azure_managed_identity", "azure_service_principal",
"databricks_gcp_service_account"}
for _, cred := range alofServiceCreds {
common.CustomizeSchemaPath(m, cred).SetExactlyOneOf(alofServiceCreds)
}
@@ -25,6 +26,10 @@ var credentialSchema = common.StructToSchema(catalog.CredentialInfo{},
common.CustomizeSchemaPath(m, computed).SetComputed()
}

common.CustomizeSchemaPath(m, "databricks_gcp_service_account").SetComputed()
common.CustomizeSchemaPath(m, "databricks_gcp_service_account", "email").SetComputed()
common.CustomizeSchemaPath(m, "databricks_gcp_service_account", "credential_id").SetComputed()
common.CustomizeSchemaPath(m, "databricks_gcp_service_account", "private_key_id").SetComputed()
common.MustSchemaPath(m, "aws_iam_role", "external_id").Computed = true
common.MustSchemaPath(m, "aws_iam_role", "unity_catalog_iam_arn").Computed = true
common.MustSchemaPath(m, "azure_managed_identity", "credential_id").Computed = true
@@ -94,6 +99,14 @@ func ResourceCredential() common.Resource {
if err != nil {
return err
}
// azure client secret is sensitive, so we need to preserve it
var credOrig catalog.CredentialInfo
common.DataToStructPointer(d, credentialSchema, &credOrig)
if credOrig.AzureServicePrincipal != nil {
if credOrig.AzureServicePrincipal.ClientSecret != "" {
cred.AzureServicePrincipal.ClientSecret = credOrig.AzureServicePrincipal.ClientSecret
}
}
d.Set("credential_id", cred.Id)
return common.StructToData(cred, credentialSchema, d)
},
7 changes: 1 addition & 6 deletions catalog/resource_metastore_data_access.go
@@ -90,13 +90,8 @@ func ResourceMetastoreDataAccess() common.Resource {
Create: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
metastoreId := d.Get("metastore_id").(string)

tmpSchema := removeGcpSaField(dacSchema)
var create catalog.CreateStorageCredential
common.DataToStructPointer(d, tmpSchema, &create)
//manually add empty struct back for databricks_gcp_service_account
if _, ok := d.GetOk("databricks_gcp_service_account"); ok {
create.DatabricksGcpServiceAccount = &catalog.DatabricksGcpServiceAccountRequest{}
}
common.DataToStructPointer(d, dacSchema, &create)

return c.AccountOrWorkspaceRequest(func(acc *databricks.AccountClient) error {
dac, err := acc.StorageCredentials.Create(ctx,
22 changes: 4 additions & 18 deletions catalog/resource_storage_credential.go
@@ -26,22 +26,14 @@ type StorageCredentialInfo struct {
IsolationMode string `json:"isolation_mode,omitempty" tf:"computed"`
}

func removeGcpSaField(originalSchema map[string]*schema.Schema) map[string]*schema.Schema {
//common.DataToStructPointer(d, s, &create) will error out because of DatabricksGcpServiceAccount any
tmpSchema := make(map[string]*schema.Schema)
for k, v := range originalSchema {
tmpSchema[k] = v
}
delete(tmpSchema, "databricks_gcp_service_account")
return tmpSchema
}

var storageCredentialSchema = common.StructToSchema(StorageCredentialInfo{},
func(m map[string]*schema.Schema) map[string]*schema.Schema {
m["storage_credential_id"] = &schema.Schema{
Type: schema.TypeString,
Computed: true,
}
common.MustSchemaPath(m, "databricks_gcp_service_account", "email").Computed = true
common.MustSchemaPath(m, "databricks_gcp_service_account", "credential_id").Computed = true
return adjustDataAccessSchema(m)
})

@@ -50,19 +42,13 @@ func ResourceStorageCredential() common.Resource {
Schema: storageCredentialSchema,
Create: func(ctx context.Context, d *schema.ResourceData, c *common.DatabricksClient) error {
metastoreId := d.Get("metastore_id").(string)
tmpSchema := removeGcpSaField(storageCredentialSchema)

var create catalog.CreateStorageCredential
var update catalog.UpdateStorageCredential
common.DataToStructPointer(d, tmpSchema, &create)
common.DataToStructPointer(d, tmpSchema, &update)
common.DataToStructPointer(d, storageCredentialSchema, &create)
common.DataToStructPointer(d, storageCredentialSchema, &update)
update.Name = d.Get("name").(string)

//manually add empty struct back for databricks_gcp_service_account
if _, ok := d.GetOk("databricks_gcp_service_account"); ok {
create.DatabricksGcpServiceAccount = &catalog.DatabricksGcpServiceAccountRequest{}
}

return c.AccountOrWorkspaceRequest(func(acc *databricks.AccountClient) error {
storageCredential, err := acc.StorageCredentials.Create(ctx,
catalog.AccountsCreateStorageCredential{
8 changes: 8 additions & 0 deletions catalog/resource_volume.go
@@ -8,6 +8,7 @@ import (
"github.com/databricks/databricks-sdk-go/service/catalog"
"github.com/databricks/terraform-provider-databricks/common"
"github.com/hashicorp/terraform-plugin-sdk/v2/helper/schema"
"github.com/hashicorp/terraform-plugin-sdk/v2/helper/validation"
)

// This structure contains the fields of catalog.UpdateVolumeRequestContent and catalog.CreateVolumeRequestContent
@@ -50,6 +51,13 @@ func ResourceVolume() common.Resource {
Type: schema.TypeString,
Computed: true,
}
// As of 3rd December 2024, the Volumes create API returns an incorrect
// error message "CreateVolume Missing required field: volume_type"
// if you specify an invalid value for volume_type (i.e. not one of "MANAGED" or "EXTERNAL").
//
// If server side validation is added in the future, this validation function
// can be removed.
common.CustomizeSchemaPath(m, "volume_type").SetValidateFunc(validation.StringInSlice([]string{"MANAGED", "EXTERNAL"}, false))
return m
})
return common.Resource{
