This repository has been archived by the owner on Jul 13, 2020. It is now read-only.

Commit

Fix readme
iuriaranda committed Jul 3, 2019
1 parent 18f390d commit 3047b26
Showing 1 changed file: README.md (13 additions and 1 deletion)
@@ -1,6 +1,6 @@
# Terraform Vault

-[![ci.skyscrape.rs](https://ci.skyscrape.rs/api/v1/teams/skyscrapers/pipelines/terraform-modules/jobs/test-terraform-vault-master/badge)](https://ci.skyscrape.rs/teams/skyscrapers/pipelines/terraform-modules?groups=terraform-vault)
+[![ci.skyscrape.rs](https://ci.skyscrape.rs/api/v1/teams/skyscrapers/pipelines/terraform-modules/jobs/terraform-vault-test-master/badge)](https://ci.skyscrape.rs/teams/skyscrapers/pipelines/terraform-modules?groups=terraform-vault)

Terraform module(s) to set up Vault on AWS

@@ -161,18 +161,24 @@ Follow this process only if you don't have cross-region replication enabled. If
1. Restore your data to a new DynamoDB table.
1. Set `dynamodb_table_name_override` variable to the name of your new DynamoDB table
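
   The override can go in the vars file you pass to Terraform; a minimal sketch, assuming a `myvars.tfvars` file and a hypothetical table name:

```hcl
# myvars.tfvars -- the table name below is illustrative; use the
# name of the DynamoDB table you restored your backup into
dynamodb_table_name_override = "vault-dynamodb-restored"
```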
1. Remove the old DynamoDB table from the Terraform state (make sure to replace the name of the vault module you've used - `module.vault`)

```bash
terraform state rm module.vault.module.main_dynamodb_table.aws_dynamodb_table.vault_dynamodb_table
```

1. Import the new DynamoDB table to the Terraform state (make sure to replace the name of the vault module and the name of your new DynamoDB table)

```bash
terraform import module.vault.module.main_dynamodb_table.aws_dynamodb_table.vault_dynamodb_table yournewdynamotable
```

1. Taint the Vault instances so they are replaced with the new DynamoDB table name. You have to do this, as Terraform ignores any changes to the instances' user-data, which is where the DynamoDB table name is set. **Note that doing this will force a replacement of those instances on your next terraform apply**

```bash
terraform taint -module ha_vault.vault1 aws_instance.instance
terraform taint -module ha_vault.vault2 aws_instance.instance
```

1. Apply terraform. This should make the following changes:
- Modify the IAM policy to grant access to the new DynamoDB table
- Replace both Vault instances
@@ -189,21 +195,27 @@ In case you also have cross-region replication (Global tables) enabled, the proc
1. Once the restore is finished, export the data from your new DynamoDB table to S3. You can do this with Data Pipeline following [this guide from AWS](https://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-importexport-ddb-part2.html).
1. In the next steps we're going to create a new DynamoDB table, so choose a name for it and set it in the `dynamodb_table_name_override` variable. Remember that DynamoDB table names must be unique. An alternative would be to remove the existing DynamoDB table and reuse the same name for the new table, but that way you would lose all the incremental backups from your old table. If you choose to do that, just remove the table from the AWS console and skip steps 4 and 5.
1. Remove the old DynamoDB tables from the Terraform state (make sure to replace the name of the vault module you've used - `module.vault`)

```bash
terraform state rm module.vault.module.main_dynamodb_table.aws_dynamodb_table.vault_dynamodb_table
terraform state rm module.vault.module.replica_dynamodb_table.aws_dynamodb_table.vault_dynamodb_table
terraform state rm module.vault.aws_dynamodb_global_table.vault_global_table
```

1. Apply terraform targeting just the global table resource. This will create your new DynamoDB tables.

```bash
terraform apply -target module.vault.aws_dynamodb_global_table.vault_global_table -var-file myvars.tfvars
```

1. Import the Vault data from S3 to the newly created table. Same as with the export, you can do this with Data Pipeline following [this guide from AWS](https://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-importexport-ddb-part1.html). *Note that as of this writing, the import data pipeline doesn't correctly compute the write capacity of a global table: it adds up the write capacity of all the tables belonging to the global table. So if there are two tables in a global table, both with a provisioned write capacity of 40, the data pipeline will assume the table has a provisioned write capacity of 80, and as a consequence there will be a lot of throttled write requests. A workaround is to set the DynamoDB write throughput ratio of the pipeline to 0.5.*
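
   In the pipeline definition, that workaround corresponds to lowering the write throughput ratio parameter; a sketch of the parameter values, assuming the stock AWS import template's `myDDBWriteThroughputRatio` parameter id:

```json
{
  "values": {
    "myDDBWriteThroughputRatio": "0.5"
  }
}
```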
1. After the import is complete, taint the Vault instances so they are replaced with the new DynamoDB table name. You have to do this, as Terraform ignores any changes to the instances' user-data, which is where the DynamoDB table name is set. **Note that doing this will force a replacement of those instances on your next terraform apply**
```bash
terraform taint -module ha_vault.vault1 aws_instance.instance
terraform taint -module ha_vault.vault2 aws_instance.instance
```
1. Apply terraform. This should make the following changes:
- Modify the IAM policy to grant access to the new DynamoDB table
- Replace both Vault instances
