data sources
mgyucht committed Dec 12, 2024
1 parent e0dd84d commit 9114ea1
Showing 10 changed files with 419 additions and 55 deletions.
80 changes: 80 additions & 0 deletions docs/data-sources/app.md
@@ -0,0 +1,80 @@
---
subcategory: "Apps"
---
# databricks_app Data Source

-> This feature is in [Public Preview](https://docs.databricks.com/release-notes/release-types.html).

[Databricks Apps](https://docs.databricks.com/en/dev-tools/databricks-apps/index.html) run directly on a customer’s Databricks instance, integrate with their data, use and extend Databricks services, and enable users to interact through single sign-on.

This data source allows you to fetch information about a Databricks App.

## Example Usage

```hcl
data "databricks_app" "this" {
name = "my-custom-app"
}
```
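The exported attributes can be referenced elsewhere in a configuration. A minimal sketch (the app name is illustrative, and attribute paths follow the nested `app` attribute documented below):

```hcl
data "databricks_app" "this" {
  name = "my-custom-app"
}

# Expose the deployed app's URL and creator from the nested `app` attribute.
output "app_url" {
  value = data.databricks_app.this.app.url
}

output "app_creator" {
  value = data.databricks_app.this.app.creator
}
```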

## Argument Reference

The following arguments are required:

* `name` - The name of the app.

## Attribute Reference

In addition to all arguments above, the following attributes are exported:

* `app` attribute
  * `name` - The name of the app.
  * `description` - The description of the app.
  * `resources` - A list of resources that the app has access to.
  * `compute_status` attribute
    * `state` - State of the app compute.
    * `message` - Compute status message.
  * `app_status` attribute
    * `state` - State of the application.
    * `message` - Application status message.
  * `url` - The URL of the app once it is deployed.
  * `create_time` - The creation time of the app.
  * `creator` - The email of the user that created the app.
  * `update_time` - The update time of the app.
  * `updater` - The email of the user that last updated the app.
  * `service_principal_id` - The ID of the app service principal.
  * `service_principal_name` - The name of the app service principal.
  * `default_source_code_path` - The default workspace file system path of the source code from which app deployments are created. This field tracks the workspace source code path of the last active deployment.

### resources Attribute

This attribute describes a resource used by the app.

* `name` - The name of the resource.
* `description` - The description of the resource.

Exactly one of the following attributes will be provided:

* `secret` attribute
  * `scope` - Scope of the secret to grant permission on.
  * `key` - Key of the secret to grant permission on.
  * `permission` - Permission to grant on the secret scope. For secrets, only one permission is allowed. Permission must be one of: `READ`, `WRITE`, `MANAGE`.
* `sql_warehouse` attribute
  * `id` - ID of the SQL warehouse to grant permission on.
  * `permission` - Permission to grant on the SQL warehouse. Supported permissions are: `CAN_MANAGE`, `CAN_USE`, `IS_OWNER`.
* `serving_endpoint` attribute
  * `name` - Name of the serving endpoint to grant permission on.
  * `permission` - Permission to grant on the serving endpoint. Supported permissions are: `CAN_MANAGE`, `CAN_QUERY`, `CAN_VIEW`.
* `job` attribute
  * `id` - ID of the job to grant permission on.
  * `permission` - Permissions to grant on the job. Supported permissions are: `CAN_MANAGE`, `IS_OWNER`, `CAN_MANAGE_RUN`, `CAN_VIEW`.
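As a sketch of how these nested attributes might be consumed (the app name is illustrative; paths follow the structure documented above):

```hcl
data "databricks_app" "this" {
  name = "my-custom-app"
}

# Names of all resources the app has access to.
output "app_resource_names" {
  value = [for r in data.databricks_app.this.app.resources : r.name]
}
```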

## Related Resources

The following resources are used in the same context:

* [databricks_app](../resources/app.md) to manage [Databricks Apps](https://docs.databricks.com/en/dev-tools/databricks-apps/index.html).
* [databricks_sql_endpoint](sql_endpoint.md) to manage Databricks SQL [Endpoints](https://docs.databricks.com/sql/admin/sql-endpoints.html).
* [databricks_model_serving](model_serving.md) to serve this model on a Databricks serving endpoint.
* [databricks_secret](secret.md) to manage [secrets](https://docs.databricks.com/security/secrets/index.html#secrets-user-guide) in Databricks workspace.
* [databricks_job](job.md) to manage [Databricks Jobs](https://docs.databricks.com/jobs.html) to run non-interactive code.
72 changes: 72 additions & 0 deletions docs/data-sources/apps.md
@@ -0,0 +1,72 @@
---
subcategory: "Apps"
---
# databricks_apps Data Source

-> This feature is in [Public Preview](https://docs.databricks.com/release-notes/release-types.html).

[Databricks Apps](https://docs.databricks.com/en/dev-tools/databricks-apps/index.html) run directly on a customer’s Databricks instance, integrate with their data, use and extend Databricks services, and enable users to interact through single sign-on.

This data source allows you to fetch information about all Databricks Apps within a workspace.

## Example Usage

```hcl
data "databricks_apps" "all_apps" {}
```
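The returned list can be filtered or iterated with standard Terraform expressions. A sketch, assuming the list attribute is exported as `apps` as documented below:

```hcl
data "databricks_apps" "all_apps" {}

# Map of app name to deployed URL for every app in the workspace.
output "app_urls" {
  value = { for a in data.databricks_apps.all_apps.apps : a.name => a.url }
}
```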

## Attribute Reference

The following attributes are exported:

* `apps` - A list of [databricks_app](../resources/app.md) resources.
  * `name` - The name of the app.
  * `description` - The description of the app.
  * `resources` - A list of resources that the app has access to.
  * `compute_status` attribute
    * `state` - State of the app compute.
    * `message` - Compute status message.
  * `app_status` attribute
    * `state` - State of the application.
    * `message` - Application status message.
  * `url` - The URL of the app once it is deployed.
  * `create_time` - The creation time of the app.
  * `creator` - The email of the user that created the app.
  * `update_time` - The update time of the app.
  * `updater` - The email of the user that last updated the app.
  * `service_principal_id` - The ID of the app service principal.
  * `service_principal_name` - The name of the app service principal.
  * `default_source_code_path` - The default workspace file system path of the source code from which app deployments are created. This field tracks the workspace source code path of the last active deployment.

### resources Attribute

This attribute describes a resource used by the app.

* `name` - The name of the resource.
* `description` - The description of the resource.

Exactly one of the following attributes will be provided:

* `secret` attribute
  * `scope` - Scope of the secret to grant permission on.
  * `key` - Key of the secret to grant permission on.
  * `permission` - Permission to grant on the secret scope. For secrets, only one permission is allowed. Permission must be one of: `READ`, `WRITE`, `MANAGE`.
* `sql_warehouse` attribute
  * `id` - ID of the SQL warehouse to grant permission on.
  * `permission` - Permission to grant on the SQL warehouse. Supported permissions are: `CAN_MANAGE`, `CAN_USE`, `IS_OWNER`.
* `serving_endpoint` attribute
  * `name` - Name of the serving endpoint to grant permission on.
  * `permission` - Permission to grant on the serving endpoint. Supported permissions are: `CAN_MANAGE`, `CAN_QUERY`, `CAN_VIEW`.
* `job` attribute
  * `id` - ID of the job to grant permission on.
  * `permission` - Permissions to grant on the job. Supported permissions are: `CAN_MANAGE`, `IS_OWNER`, `CAN_MANAGE_RUN`, `CAN_VIEW`.

## Related Resources

The following resources are used in the same context:

* [databricks_app](../resources/app.md) to manage [Databricks Apps](https://docs.databricks.com/en/dev-tools/databricks-apps/index.html).
* [databricks_sql_endpoint](sql_endpoint.md) to manage Databricks SQL [Endpoints](https://docs.databricks.com/sql/admin/sql-endpoints.html).
* [databricks_model_serving](model_serving.md) to serve this model on a Databricks serving endpoint.
* [databricks_secret](secret.md) to manage [secrets](https://docs.databricks.com/security/secrets/index.html#secrets-user-guide) in Databricks workspace.
* [databricks_job](job.md) to manage [Databricks Jobs](https://docs.databricks.com/jobs.html) to run non-interactive code.
4 changes: 2 additions & 2 deletions docs/resources/app.md
@@ -47,12 +47,12 @@ The following arguments are required:

### resources Configuration Attribute

-This block describes individual resource.
+This attribute describes a resource used by the app.

* `name` - (Required) The name of the resource.
* `description` - (Optional) The description of the resource.

-Exactly one of the specific blocks described below is required:
+Exactly one of the following attributes must be provided:

* `secret` attribute
* `scope` - Scope of the secret to grant permission on.
2 changes: 2 additions & 0 deletions internal/providers/pluginfw/pluginfw_rollout_utils.go
@@ -49,6 +49,8 @@ var pluginFwOnlyResources = []func() resource.Resource{
// List of data sources that have been onboarded to the plugin framework - not migrated from sdkv2.
// Keep this list sorted.
var pluginFwOnlyDataSources = []func() datasource.DataSource{
	app.DataSourceApp,
	app.DataSourceApps,
	catalog.DataSourceFunctions,
	notificationdestinations.DataSourceNotificationDestinations,
	registered_model.DataSourceRegisteredModel,
85 changes: 85 additions & 0 deletions internal/providers/pluginfw/products/app/data_app.go
@@ -0,0 +1,85 @@
package app

import (
	"context"
	"reflect"

	"github.com/databricks/terraform-provider-databricks/common"
	pluginfwcommon "github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/common"
	pluginfwcontext "github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/context"
	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/converters"
	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/tfschema"
	"github.com/databricks/terraform-provider-databricks/internal/service/apps_tf"
	"github.com/hashicorp/terraform-plugin-framework/datasource"
	"github.com/hashicorp/terraform-plugin-framework/path"
	"github.com/hashicorp/terraform-plugin-framework/types"
)

func DataSourceApp() datasource.DataSource {
	return &dataSourceApp{}
}

type dataSourceApp struct {
	client *common.DatabricksClient
}

type dataApp struct {
	Name types.String `tfsdk:"name"`
	App  types.Object `tfsdk:"app" tf:"computed"`
}

func (dataApp) GetComplexFieldTypes(context.Context) map[string]reflect.Type {
	return map[string]reflect.Type{
		"app": reflect.TypeOf(apps_tf.App{}),
	}
}

func (a dataSourceApp) Metadata(ctx context.Context, req datasource.MetadataRequest, resp *datasource.MetadataResponse) {
	resp.TypeName = pluginfwcommon.GetDatabricksProductionName(resourceName)
}

func (a dataSourceApp) Schema(ctx context.Context, req datasource.SchemaRequest, resp *datasource.SchemaResponse) {
	resp.Schema = tfschema.DataSourceStructToSchema(ctx, dataApp{}, func(cs tfschema.CustomizableSchema) tfschema.CustomizableSchema {
		return cs
	})
}

func (a *dataSourceApp) Configure(ctx context.Context, req datasource.ConfigureRequest, resp *datasource.ConfigureResponse) {
	if a.client == nil && req.ProviderData != nil {
		a.client = pluginfwcommon.ConfigureDataSource(req, resp)
	}
}

func (a *dataSourceApp) Read(ctx context.Context, req datasource.ReadRequest, resp *datasource.ReadResponse) {
	ctx = pluginfwcontext.SetUserAgentInDataSourceContext(ctx, resourceName)
	w, diags := a.client.GetWorkspaceClient()
	resp.Diagnostics.Append(diags...)
	if resp.Diagnostics.HasError() {
		return
	}

	var name types.String
	resp.Diagnostics.Append(req.Config.GetAttribute(ctx, path.Root("name"), &name)...)
	if resp.Diagnostics.HasError() {
		return
	}

	appGoSdk, err := w.Apps.GetByName(ctx, name.ValueString())
	if err != nil {
		resp.Diagnostics.AddError("failed to read app", err.Error())
		return
	}

	var newApp apps_tf.App
	resp.Diagnostics.Append(converters.GoSdkToTfSdkStruct(ctx, appGoSdk, &newApp)...)
	if resp.Diagnostics.HasError() {
		return
	}
	dataApp := dataApp{Name: name, App: newApp.ToObjectValue(ctx)}
	resp.Diagnostics.Append(resp.State.Set(ctx, dataApp)...)
	if resp.Diagnostics.HasError() {
		return
	}
}

var _ datasource.DataSourceWithConfigure = &dataSourceApp{}
21 changes: 21 additions & 0 deletions internal/providers/pluginfw/products/app/data_app_acc_test.go
@@ -0,0 +1,21 @@
package app_test

import (
	"testing"

	"github.com/databricks/terraform-provider-databricks/internal/acceptance"
)

func TestAccAppDataSource(t *testing.T) {
	acceptance.LoadWorkspaceEnv(t)
	if acceptance.IsGcp(t) {
		acceptance.Skipf(t)("not available on GCP")
	}
	acceptance.WorkspaceLevel(t, acceptance.Step{
		Template: makeTemplate("My app") + `
			data "databricks_app" "this" {
				name = databricks_app.this.name
			}
		`,
	})
}
83 changes: 83 additions & 0 deletions internal/providers/pluginfw/products/app/data_apps.go
@@ -0,0 +1,83 @@
package app

import (
	"context"
	"reflect"

	"github.com/databricks/databricks-sdk-go/service/apps"
	"github.com/databricks/terraform-provider-databricks/common"
	pluginfwcommon "github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/common"
	pluginfwcontext "github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/context"
	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/converters"
	"github.com/databricks/terraform-provider-databricks/internal/providers/pluginfw/tfschema"
	"github.com/databricks/terraform-provider-databricks/internal/service/apps_tf"
	"github.com/hashicorp/terraform-plugin-framework/attr"
	"github.com/hashicorp/terraform-plugin-framework/datasource"
	"github.com/hashicorp/terraform-plugin-framework/types"
)

func DataSourceApps() datasource.DataSource {
	return &dataSourceApps{}
}

type dataSourceApps struct {
	client *common.DatabricksClient
}

type dataApps struct {
	Apps types.List `tfsdk:"apps" tf:"computed"`
}

func (dataApps) GetComplexFieldTypes(context.Context) map[string]reflect.Type {
	return map[string]reflect.Type{
		"apps": reflect.TypeOf(apps_tf.App{}),
	}
}

func (a dataSourceApps) Metadata(ctx context.Context, req datasource.MetadataRequest, resp *datasource.MetadataResponse) {
	resp.TypeName = pluginfwcommon.GetDatabricksProductionName(resourceNamePlural)
}

func (a dataSourceApps) Schema(ctx context.Context, req datasource.SchemaRequest, resp *datasource.SchemaResponse) {
	resp.Schema = tfschema.DataSourceStructToSchema(ctx, dataApps{}, func(cs tfschema.CustomizableSchema) tfschema.CustomizableSchema {
		return cs
	})
}

func (a *dataSourceApps) Configure(ctx context.Context, req datasource.ConfigureRequest, resp *datasource.ConfigureResponse) {
	if a.client == nil && req.ProviderData != nil {
		a.client = pluginfwcommon.ConfigureDataSource(req, resp)
	}
}

func (a *dataSourceApps) Read(ctx context.Context, req datasource.ReadRequest, resp *datasource.ReadResponse) {
	ctx = pluginfwcontext.SetUserAgentInDataSourceContext(ctx, resourceName)
	w, diags := a.client.GetWorkspaceClient()
	resp.Diagnostics.Append(diags...)
	if resp.Diagnostics.HasError() {
		return
	}

	appsGoSdk, err := w.Apps.ListAll(ctx, apps.ListAppsRequest{})
	if err != nil {
		resp.Diagnostics.AddError("failed to list apps", err.Error())
		return
	}

	apps := []attr.Value{}
	for _, appGoSdk := range appsGoSdk {
		app := apps_tf.App{}
		resp.Diagnostics.Append(converters.GoSdkToTfSdkStruct(ctx, appGoSdk, &app)...)
		if resp.Diagnostics.HasError() {
			return
		}
		apps = append(apps, app.ToObjectValue(ctx))
	}
	dataApps := dataApps{Apps: types.ListValueMust(apps_tf.App{}.Type(ctx), apps)}
	resp.Diagnostics.Append(resp.State.Set(ctx, dataApps)...)
	if resp.Diagnostics.HasError() {
		return
	}
}

var _ datasource.DataSourceWithConfigure = &dataSourceApps{}
19 changes: 19 additions & 0 deletions internal/providers/pluginfw/products/app/data_apps_acc_test.go
@@ -0,0 +1,19 @@
package app_test

import (
	"testing"

	"github.com/databricks/terraform-provider-databricks/internal/acceptance"
)

func TestAccAppsDataSource(t *testing.T) {
	acceptance.LoadWorkspaceEnv(t)
	if acceptance.IsGcp(t) {
		acceptance.Skipf(t)("not available on GCP")
	}
	acceptance.WorkspaceLevel(t, acceptance.Step{
		Template: `
			data "databricks_apps" "this" { }
		`,
	})
}
3 changes: 2 additions & 1 deletion internal/providers/pluginfw/products/app/resource_app.go
@@ -18,7 +18,8 @@ import (
)

const (
-	resourceName = "app"
+	resourceName       = "app"
+	resourceNamePlural = "apps"
)

func ResourceApp() resource.Resource {
Expand Down