feat: Add Support for snowflake destination
fdmsantos committed Jul 25, 2024
1 parent d02fa9a commit f98fad5
Showing 11 changed files with 352 additions and 17 deletions.
2 changes: 1 addition & 1 deletion CHANGELOG.md
@@ -7,7 +7,7 @@ All notable changes to this project will be documented in this file.

### Features

* Prevent perpetual differences during the terraform plan/apply ([#9](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/issues/9)) ([0f72a49](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/commit/0f72a49624a464cbbb004b18e02efd56d07b175b))
* Prevent perpetual differences during the terraform plan/apply ([#14](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/issues/9)) ([0f72a49](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/commit/0f72a49624a464cbbb004b18e02efd56d07b175b))

## [3.3.0](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/compare/v3.2.0...v3.3.0) (2024-03-05)

59 changes: 50 additions & 9 deletions README.md
@@ -24,6 +24,7 @@ Supports all destinations and all Kinesis Firehose Features.
* [Opensearch](#opensearch)
* [Opensearch Serverless](#opensearch-serverless)
* [Splunk](#splunk)
* [Snowflake](#snowflake)
* [HTTP Endpoint](#http-endpoint)
* [Datadog](#datadog)
* [New Relic](#new-relic)
@@ -77,11 +78,13 @@ Supports all destinations and all Kinesis Firehose Features.
- Data Format Conversion
- Dynamic Partition
- Redshift
- VPC Support. Security Groups creation supported
- VPC Support. Security Groups creation supported.
- ElasticSearch / Opensearch / Opensearch Serverless
- VPC Support. Security Groups creation supported
- VPC Support. Security Groups creation supported.
- Splunk
- VPC Support. Security Groups creation supported
- VPC Support. Security Groups creation supported.
- Snowflake
- VPCE Support.
- Custom Http Endpoint
- DataDog
- Coralogix
@@ -295,6 +298,28 @@ module "firehose" {
}
```

#### Snowflake

**To Enable It:** `destination = "snowflake"`

**Variables Prefix:** `snowflake_`

```hcl
module "firehose" {
  source                       = "fdmsantos/kinesis-firehose/aws"
  version                      = "x.x.x"
  name                         = "firehose-delivery-stream"
  destination                  = "snowflake"
  snowflake_account_identifier = "<snowflake_account_identifier>"
  snowflake_private_key        = "<snowflake_private_key>"
  snowflake_key_passphrase     = "<snowflake_key_passphrase>"
  snowflake_user               = "<snowflake_user>"
  snowflake_database           = "<snowflake_database>"
  snowflake_schema             = "<snowflake_schema>"
  snowflake_table              = "<snowflake_table>"
}
```
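
Firehose can also reach Snowflake over AWS PrivateLink through the `snowflake_private_link_vpce_id` variable (listed in the inputs table below). A minimal sketch, assuming an already-provisioned VPC endpoint ID — the endpoint itself is not created by this module:

```hcl
module "firehose" {
  source                       = "fdmsantos/kinesis-firehose/aws"
  version                      = "x.x.x"
  name                         = "firehose-delivery-stream"
  destination                  = "snowflake"
  snowflake_account_identifier = "<snowflake_account_identifier>"
  snowflake_private_key        = "<snowflake_private_key>"
  snowflake_key_passphrase     = "<snowflake_key_passphrase>"
  snowflake_user               = "<snowflake_user>"
  snowflake_database           = "<snowflake_database>"
  snowflake_schema             = "<snowflake_schema>"
  snowflake_table              = "<snowflake_table>"

  # Assumption: ID of an existing VPC endpoint used to connect privately to Snowflake.
  snowflake_private_link_vpce_id = "<vpce_id>"
}
```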

#### HTTP Endpoint

**To Enable It:** `destination = "http_endpoint"`
@@ -789,9 +814,10 @@ The destination variable configured in module is mapped to firehose valid destin
| s3 and extended_s3 | extended_s3 | There is no difference between s3 and extended_s3 destinations |
| redshift | redshift | |
| splunk | splunk | |
| opensearch | elasticsearch | |
| elasticsearch | elasticsearch | |
| opensearch | opensearch | |
| opensearchserverless | opensearchserverless | |
| snowflake | snowflake | |
| http_endpoint | http_endpoint | |
| datadog | http_endpoint | The difference from http_endpoint is that the http_endpoint_url and http_endpoint_name variables aren't supported, and it's necessary to configure the datadog_endpoint_type variable |
| newrelic | http_endpoint | The difference from http_endpoint is that the http_endpoint_url and http_endpoint_name variables aren't supported, and it's necessary to configure the newrelic_endpoint_type variable |
@@ -817,6 +843,7 @@ The destination variable configured in module is mapped to firehose valid destin
- [Opensearch Serverless In Vpc](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/opensearch/direct-put-to-opensearchserverless-in-vpc) - Creates a Kinesis Firehose Stream with serverless opensearch in VPC as destination.
- [Public Splunk](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/splunk/public-splunk) - Creates a Kinesis Firehose Stream with public splunk as destination.
- [Splunk In VPC](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/splunk/splunk-in-vpc) - Creates a Kinesis Firehose Stream with splunk in VPC as destination.
- [Snowflake](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/snowflake/direct-put-to-snowflake) - Creates a Kinesis Firehose Stream with snowflake as destination.
- [Custom Http Endpoint](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/http-endpoint/custom-http-endpoint) - Creates a Kinesis Firehose Stream with custom http endpoint as destination.
- [Datadog](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/http-endpoint/datadog) - Creates a Kinesis Firehose Stream with datadog europe metrics as destination.
- [New Relic](https://github.com/fdmsantos/terraform-aws-kinesis-firehose/tree/main/examples/http-endpoint/newrelic) - Creates a Kinesis Firehose Stream with New Relic europe metrics as destination.
@@ -833,13 +860,13 @@ The destination variable configured in module is mapped to firehose valid destin
| Name | Version |
|------|---------|
| <a name="requirement_terraform"></a> [terraform](#requirement\_terraform) | >= 0.13.1 |
| <a name="requirement_aws"></a> [aws](#requirement\_aws) | >= 5.33 |
| <a name="requirement_aws"></a> [aws](#requirement\_aws) | >= 5.47 |

## Providers

| Name | Version |
|------|---------|
| <a name="provider_aws"></a> [aws](#provider\_aws) | >= 5.33 |
| <a name="provider_aws"></a> [aws](#provider\_aws) | >= 5.47 |

## Modules

@@ -927,9 +954,9 @@ No modules.
| <a name="input_buffering_size"></a> [buffering\_size](#input\_buffering\_size) | Buffer incoming data to the specified size, in MBs, before delivering it to the destination. | `number` | `5` | no |
| <a name="input_configure_existing_application_role"></a> [configure\_existing\_application\_role](#input\_configure\_existing\_application\_role) | Set it to True if want use existing application role to add the firehose Policy | `bool` | `false` | no |
| <a name="input_coralogix_endpoint_location"></a> [coralogix\_endpoint\_location](#input\_coralogix\_endpoint\_location) | Endpoint Location to coralogix destination | `string` | `"ireland"` | no |
| <a name="input_coralogix_parameter_application_name"></a> [coralogix\_parameter\_application\_name](#input\_coralogix\_parameter\_application\_name) | By default, your delivery stream arn will be used as applicationName | `string` | `null` | no |
| <a name="input_coralogix_parameter_subsystem_name"></a> [coralogix\_parameter\_subsystem\_name](#input\_coralogix\_parameter\_subsystem\_name) | By default, your delivery stream name will be used as subsystemName | `string` | `null` | no |
| <a name="input_coralogix_parameter_use_dynamic_values"></a> [coralogix\_parameter\_use\_dynamic\_values](#input\_coralogix\_parameter\_use\_dynamic\_values) | To use dynamic values for applicationName and subsystemName | `bool` | `false` | no |
| <a name="input_coralogix_parameter_application_name"></a> [coralogix\_parameter\_application\_name](#input\_coralogix\_parameter\_application\_name) | By default, your delivery stream arn will be used as applicationName. | `string` | `null` | no |
| <a name="input_coralogix_parameter_subsystem_name"></a> [coralogix\_parameter\_subsystem\_name](#input\_coralogix\_parameter\_subsystem\_name) | By default, your delivery stream name will be used as subsystemName. | `string` | `null` | no |
| <a name="input_coralogix_parameter_use_dynamic_values"></a> [coralogix\_parameter\_use\_dynamic\_values](#input\_coralogix\_parameter\_use\_dynamic\_values) | To use dynamic values for applicationName and subsystemName. | `bool` | `false` | no |
| <a name="input_create"></a> [create](#input\_create) | Controls if kinesis firehose should be created (it affects almost all resources) | `bool` | `true` | no |
| <a name="input_create_application_role"></a> [create\_application\_role](#input\_create\_application\_role) | Set it to true to create role to be used by the source | `bool` | `false` | no |
| <a name="input_create_application_role_policy"></a> [create\_application\_role\_policy](#input\_create\_application\_role\_policy) | Set it to true to create policy to the role used by the source | `bool` | `false` | no |
@@ -1065,6 +1092,20 @@ No modules.
| <a name="input_s3_kms_key_arn"></a> [s3\_kms\_key\_arn](#input\_s3\_kms\_key\_arn) | Specifies the KMS key ARN the stream will use to encrypt data. If not set, no encryption will be used | `string` | `null` | no |
| <a name="input_s3_own_bucket"></a> [s3\_own\_bucket](#input\_s3\_own\_bucket) | Indicates if you own the bucket. If not, will be configure permissions to grants the bucket owner full access to the objects delivered by Kinesis Data Firehose | `bool` | `true` | no |
| <a name="input_s3_prefix"></a> [s3\_prefix](#input\_s3\_prefix) | The YYYY/MM/DD/HH time format prefix is automatically used for delivered S3 files. You can specify an extra prefix to be added in front of the time format prefix. Note that if the prefix ends with a slash, it appears as a folder in the S3 bucket | `string` | `null` | no |
| <a name="input_snowflake_account_identifier"></a> [snowflake\_account\_identifier](#input\_snowflake\_account\_identifier) | The Snowflake account identifier. | `string` | `null` | no |
| <a name="input_snowflake_content_column_name"></a> [snowflake\_content\_column\_name](#input\_snowflake\_content\_column\_name) | The name of the content column. | `string` | `null` | no |
| <a name="input_snowflake_data_loading_option"></a> [snowflake\_data\_loading\_option](#input\_snowflake\_data\_loading\_option) | The data loading option. | `string` | `null` | no |
| <a name="input_snowflake_database"></a> [snowflake\_database](#input\_snowflake\_database) | The Snowflake database name. | `string` | `null` | no |
| <a name="input_snowflake_key_passphrase"></a> [snowflake\_key\_passphrase](#input\_snowflake\_key\_passphrase) | The Snowflake passphrase for the private key. | `string` | `null` | no |
| <a name="input_snowflake_metadata_column_name"></a> [snowflake\_metadata\_column\_name](#input\_snowflake\_metadata\_column\_name) | The name of the metadata column. | `string` | `null` | no |
| <a name="input_snowflake_private_key"></a> [snowflake\_private\_key](#input\_snowflake\_private\_key) | The Snowflake private key for authentication. | `string` | `null` | no |
| <a name="input_snowflake_private_link_vpce_id"></a> [snowflake\_private\_link\_vpce\_id](#input\_snowflake\_private\_link\_vpce\_id) | The VPCE ID for Firehose to privately connect with Snowflake. | `string` | `null` | no |
| <a name="input_snowflake_retry_duration"></a> [snowflake\_retry\_duration](#input\_snowflake\_retry\_duration) | The length of time during which Firehose retries delivery after a failure, starting from the initial request and including the first attempt. | `string` | `60` | no |
| <a name="input_snowflake_role_configuration_enabled"></a> [snowflake\_role\_configuration\_enabled](#input\_snowflake\_role\_configuration\_enabled) | Whether the Snowflake role is enabled. | `bool` | `false` | no |
| <a name="input_snowflake_role_configuration_role"></a> [snowflake\_role\_configuration\_role](#input\_snowflake\_role\_configuration\_role) | The Snowflake role. | `string` | `null` | no |
| <a name="input_snowflake_schema"></a> [snowflake\_schema](#input\_snowflake\_schema) | The Snowflake schema name. | `string` | `null` | no |
| <a name="input_snowflake_table"></a> [snowflake\_table](#input\_snowflake\_table) | The Snowflake table name. | `string` | `null` | no |
| <a name="input_snowflake_user"></a> [snowflake\_user](#input\_snowflake\_user) | The user for authentication.. | `string` | `null` | no |
| <a name="input_source_role_arn"></a> [source\_role\_arn](#input\_source\_role\_arn) | The ARN of the role that provides access to the source. Only Supported on Kinesis and MSK Sources | `string` | `null` | no |
| <a name="input_source_use_existing_role"></a> [source\_use\_existing\_role](#input\_source\_use\_existing\_role) | Indicates if want use the kinesis firehose role for sources access. Only Supported on Kinesis and MSK Sources | `bool` | `true` | no |
| <a name="input_splunk_hec_acknowledgment_timeout"></a> [splunk\_hec\_acknowledgment\_timeout](#input\_splunk\_hec\_acknowledgment\_timeout) | The amount of time, that Kinesis Firehose waits to receive an acknowledgment from Splunk after it sends it data | `number` | `600` | no |
60 changes: 60 additions & 0 deletions examples/snowflake/direct-put-to-snowflake/README.md
@@ -0,0 +1,60 @@
# Snowflake

Configuration in this directory creates a Kinesis Firehose stream with Direct Put as source and Snowflake as destination.

This example can be tested with the demo data available in the Kinesis Firehose console.

## Usage

To run this example you need to execute:

```bash
$ terraform init
$ terraform plan
$ terraform apply
```

Note that this example may create resources which cost money. Run `terraform destroy` when you don't need these resources.
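
The example's `main.tf` hard-codes a throwaway private key purely for demonstration. In a real deployment you would more likely inject the key from a variable or a secret store; a minimal sketch, assuming a hypothetical `snowflake_private_key` variable that is not part of this example:

```hcl
# Hypothetical addition to variables.tf (not part of this example).
variable "snowflake_private_key" {
  description = "Private key used by Kinesis Firehose to authenticate to Snowflake"
  type        = string
  sensitive   = true # requires Terraform >= 0.14
}

# The module call would then reference the variable instead of the inline literal:
# snowflake_private_key = var.snowflake_private_key
```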

<!-- BEGINNING OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
## Requirements

| Name | Version |
|------|---------|
| <a name="requirement_terraform"></a> [terraform](#requirement\_terraform) | >= 0.13.1 |
| <a name="requirement_aws"></a> [aws](#requirement\_aws) | ~> 5.0 |
| <a name="requirement_random"></a> [random](#requirement\_random) | >= 2.0 |

## Providers

| Name | Version |
|------|---------|
| <a name="provider_aws"></a> [aws](#provider\_aws) | ~> 5.0 |
| <a name="provider_random"></a> [random](#provider\_random) | >= 2.0 |

## Modules

| Name | Source | Version |
|------|--------|---------|
| <a name="module_firehose"></a> [firehose](#module\_firehose) | ../../../ | n/a |

## Resources

| Name | Type |
|------|------|
| [aws_kms_key.this](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/kms_key) | resource |
| [aws_s3_bucket.s3](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/s3_bucket) | resource |
| [random_pet.this](https://registry.terraform.io/providers/hashicorp/random/latest/docs/resources/pet) | resource |

## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
| <a name="input_name_prefix"></a> [name\_prefix](#input\_name\_prefix) | Name prefix to use in resources | `string` | `"firehose-to-snowflake"` | no |

## Outputs

| Name | Description |
|------|-------------|
| <a name="output_firehose_role"></a> [firehose\_role](#output\_firehose\_role) | Firehose Role |
<!-- END OF PRE-COMMIT-TERRAFORM DOCS HOOK -->
38 changes: 38 additions & 0 deletions examples/snowflake/direct-put-to-snowflake/main.tf
@@ -0,0 +1,38 @@
resource "random_pet" "this" {
length = 2
}

resource "aws_s3_bucket" "s3" {
bucket = "${var.name_prefix}-dest-bucket-${random_pet.this.id}"
force_destroy = true
}

resource "aws_kms_key" "this" {
description = "${var.name_prefix}-kms-key"
deletion_window_in_days = 7
}

module "firehose" {
source = "../../../"
name = "${var.name_prefix}-delivery-stream"
destination = "snowflake"
snowflake_account_identifier = "demo"
snowflake_private_key = "MIIEvgIBADANBgkqhkiG9w0BAQEFAASCBKgwggSkAgEAAoIBAQDIQEU9NvCE4EyK0QZtFBLYWX6KAnNmel4zHsNJ4WEjNzY/YEASrJ9YtHjxItVig4kQqumn8FWkbPoKLUYUqIq9UBIvtjzlsTgMJ7GznShsm0M/n1Bszqmxwm1AwFAPfH21h2MNIzXkHitg/2BN3bkTGctmySFOwNRRervo5HIUtr4qqYZwVYDQtT8+NVL1Tchvgkv4kOuQmDXpmc7iRPx0WQZU1dyPzJ9Vg5sN2nJPZwfRTL0dJfoOVOjJQTZSAEvNw3d05ez0aKBMWYM97ZFc6IJzaSEx19RYjPnluYWlpkUp309cIUlGQHGmVSxPpaoOrI5cfHTdudCzYmQiRxebAgMBAAECggEAA+/5zIx/8Pav2plXqyu50SI1WSHlwm4iFM/LbRsu30WrJQFPwvx10kyFPrOoXBxbNoYvkPQqagmiShYozhn1nGenehyTfEqztV15xi0rnyTXgNRcC2pRhGrCbGcCvcM2DAHewlRQoTsh9uM7ByQIbp798QYqnTbbTsPw+kLt34jpz3eJjFMqB+uVtLuFDA4PZi1Nq/EJhWyuwi3taW2dKn3gx0DD69yxq5lS8USV/XQ0BrF4bbcmQoEJuKnt16hMGl9PMDkqX9DPnrxBR6a9BDMaOw/r4kyOXQBgPIRr59UfN13E1Lj35rXnK6C0TcA98pichFFYjiUvR5ss+Ob7KQKBgQD9ZHqGlI+s826ov4XTbnhcxUGBFX1NHoU9zE28w6bs2++0Bim+jMgwIdJmG7ziOsV9PpvP5Zq3N2tPnkSEA+q8N+BtfBOf60kjpe4eoaYOZiGFpqGPmAW9p+b+gWsOxyUQ2HA9FdUCwEnnWx1gIdJ5BFo4YIEdJWx3Uybw1fxtswKBgQDKT8yizQ0y0dzaCyxNeIIzYpg8Cx77tvV0EFBhDQIt/fEZOIBruBZUaZYaZReEv7VHd6bIGKASDFOx7XtdhlVbfa2p5o/7rPYlAhgsCwfW94ESYJw0X3KTlS9ulSseF+bmPBHIIXPfjARcJIDi4TKv60vbW1Wdxcv08uvFTvjKeQKBgDDeEt8ngXnqTJoQrZ9z+5Rwmkxpt4uK6klbwFY6KVQeqmC+m4hbIDRgIXJ9wPSkPvgDfgsfDbJt5q0pKa+IDdoUsJyMxEAgIS/VzVFs/Vhji+15kEjgGaNU4TCOBvaHo3dXNnYhYr4wFVCf+s9SVoPuOfQLcHsNf5iXmbfynMcPAoGBAKeZPBmSbWCwYplvsB/tuU8AWsVDIUO96dFgwnXj5O5c9SLDn/+c3ULIxcTQAo/CkVbHVK9nVxQciilYZ16vLn9AumGJ07XXL4KxHX0/FhuLpq2mw0DP4YdJi6W8hZ/EhVAuazy0Gd4TjHkY9Hz/upHqB0mNfHvbpH8jzxYBujFhAoGBAMn0LHHuaajivswiK9QpM95qv2tk1wC7spZQXh2Ky4TYcTo3S83datye7Uk85NKYt4790anaGjegA6cTbuky8FgnGm1+iqVhyGxfUMPwREgWOZ3km0DeQGHxApYHiVx2xD6oZzTVpgxM7S6pCX2YxxWQolq7mIfOg5h6U6b5GmiT"
snowflake_user = "user"
snowflake_database = "database"
snowflake_schema = "schema"
snowflake_table = "table"
snowflake_data_loading_option = "VARIANT_CONTENT_AND_METADATA_MAPPING"
snowflake_metadata_column_name = "test"
snowflake_content_column_name = "test"
snowflake_role_configuration_enabled = true
snowflake_role_configuration_role = "snowflake_role"
s3_backup_mode = "FailedOnly"
s3_backup_prefix = "backup/"
s3_backup_bucket_arn = aws_s3_bucket.s3.arn
s3_backup_buffering_interval = 100
s3_backup_buffering_size = 100
s3_backup_compression = "GZIP"
s3_backup_enable_encryption = true
s3_backup_kms_key_arn = aws_kms_key.this.arn
}
4 changes: 4 additions & 0 deletions examples/snowflake/direct-put-to-snowflake/outputs.tf
@@ -0,0 +1,4 @@
output "firehose_role" {
description = "Firehose Role"
value = module.firehose.kinesis_firehose_role_arn
}
5 changes: 5 additions & 0 deletions examples/snowflake/direct-put-to-snowflake/variables.tf
@@ -0,0 +1,5 @@
variable "name_prefix" {
description = "Name prefix to use in resources"
type = string
default = "firehose-to-snowflake"
}
14 changes: 14 additions & 0 deletions examples/snowflake/direct-put-to-snowflake/versions.tf
@@ -0,0 +1,14 @@
terraform {
  required_version = ">= 0.13.1"

  required_providers {
    aws = {
      source = "hashicorp/aws"
      version = "~> 5.0"
    }
    random = {
      source = "hashicorp/random"
      version = ">= 2.0"
    }
  }
}