Based on this Medium post and this repo.
- A Digital Ocean account
- Terraform version 1.0.11
- Clone this repository. (Change the bucket region if necessary, in both the variables and the backend section.)
- Export the needed environment variables:

```shell
export TF_VAR_do_token=<YOUR_DO_PERSONAL_ACCESS_TOKEN>
export TF_VAR_spaces_access_key_id=<YOUR_DO_SPACES_ACCESS_KEY_ID>
export TF_VAR_spaces_secret_key=<YOUR_DO_SPACES_SECRET_KEY>
export AWS_ACCESS_KEY_ID=<YOUR_DO_SPACES_ACCESS_KEY_ID>
export AWS_SECRET_ACCESS_KEY=<YOUR_DO_SPACES_SECRET_KEY>
```
- Comment out the `backend` block inside the `terraform` block.
- Run `terraform init`. The state file will be generated locally.
- Run `terraform plan`. You should see that a DO Spaces bucket will be created.
- Run `terraform apply` to create the bucket.
- Add the previously commented-out `backend` block again.
- Run `terraform init` again. You will be prompted to confirm whether you want to copy the local state to the remote bucket. Pick `yes`.
- Run `terraform plan`. The plan should come up empty.
- Rejoice: you have bootstrapped Terraform with a remote state hosted on DigitalOcean.
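For reference, the `backend` block that gets commented out and restored might look like the sketch below. The endpoint region and bucket name here are hypothetical placeholders, not values from this repo; adjust them to match your Space. Credentials are picked up from the `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` environment variables exported above, since Spaces is S3-compatible.

```hcl
terraform {
  backend "s3" {
    # DO Spaces speaks the S3 API, so Terraform's S3 backend works
    # against a Spaces endpoint. Endpoint and bucket are hypothetical.
    endpoint = "https://fra1.digitaloceanspaces.com"
    bucket   = "my-terraform-state"
    key      = "terraform.tfstate"

    # The S3 backend requires an AWS region; Spaces ignores it.
    region = "us-east-1"

    # Skip AWS-specific checks that don't apply to Spaces.
    skip_credentials_validation = true
    skip_metadata_api_check     = true
  }
}
```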
Add the following secrets, either at the repository level or at the organization level:
- DO_SPACES_ACCESS_KEY_ID
- DO_SPACES_SECRET_ACCESS_KEY
- DO_ACCESS_TOKEN
The actions are configured to run only on a manual `workflow_dispatch` trigger.
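A minimal workflow wiring these secrets to the Terraform commands could look like the following sketch. The file path, job name, and steps are assumptions for illustration, not taken from this repo; only the secret names match the list above.

```yaml
# Hypothetical .github/workflows/terraform.yml
name: terraform
on:
  workflow_dispatch:  # manual trigger only

jobs:
  plan:
    runs-on: ubuntu-latest
    env:
      # Map repo/org secrets onto the variables Terraform expects.
      TF_VAR_do_token: ${{ secrets.DO_ACCESS_TOKEN }}
      TF_VAR_spaces_access_key_id: ${{ secrets.DO_SPACES_ACCESS_KEY_ID }}
      TF_VAR_spaces_secret_key: ${{ secrets.DO_SPACES_SECRET_ACCESS_KEY }}
      # The S3 backend reads the Spaces credentials via the AWS variables.
      AWS_ACCESS_KEY_ID: ${{ secrets.DO_SPACES_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.DO_SPACES_SECRET_ACCESS_KEY }}
    steps:
      - uses: actions/checkout@v2
      - uses: hashicorp/setup-terraform@v1
        with:
          terraform_version: 1.0.11
      - run: terraform init
      - run: terraform plan
```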