This section details how to run the solution locally and deploy your code changes from the command line.
The following dependencies must be installed:

- AWS CLI (v1)
- Python >= 3.9 and pip
- Node.js >= v20 and npm >= 10
- virtualenv
- Ruby >= 2.6
- libsnappy-dev (Debian) / snappy-devel (CentOS)
- Docker
- jq
- Java JRE
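A quick way to sanity-check which of these tools are already on your PATH (the command names used below are the usual defaults; your system's may differ):

```shell
# Report which prerequisite CLIs are reachable on PATH.
for cmd in aws python3 node npm ruby docker jq java; do
  if command -v "$cmd" >/dev/null 2>&1; then
    echo "ok: $cmd"
  else
    echo "missing: $cmd"
  fi
done
```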
Once you have installed all prerequisites, you must run the following command to create a virtualenv and install all frontend/backend dependencies before commencing development:

```shell
make setup
```

This command only needs to be run once.
To deploy the solution manually from source to your AWS account, run the following command:

```shell
make deploy \
  REGION=<aws-region> \
  ADMIN_EMAIL=<your-email-address> \
  TEMP_BUCKET=<temp-bucket-name>
```
If you use KMS for client-side encryption, you'll also need to pass a `KMS_KEYARNS` environment variable to the `make deploy` script containing the comma-delimited list of KMS key ARNs used for client-side encryption.
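For example (the key ARNs below are placeholders):

```shell
make deploy \
  REGION=<aws-region> \
  ADMIN_EMAIL=<your-email-address> \
  TEMP_BUCKET=<temp-bucket-name> \
  KMS_KEYARNS="arn:aws:kms:eu-west-1:123456789012:key/key-id-1,arn:aws:kms:eu-west-1:123456789012:key/key-id-2"
```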
For information on how to obtain your subnet and security group IDs, see Configuring a VPC for the Solution.
This will deploy the Amazon S3 Find and Forget solution using the AWS CLI profile of the current shell. By default this will be the `default` profile.
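To deploy with a different profile, you can export the standard `AWS_PROFILE` environment variable before invoking the command, since the deployment uses the AWS CLI profile of the current shell (the profile name below is hypothetical):

```shell
AWS_PROFILE=my-deploy-profile make deploy \
  REGION=<aws-region> \
  ADMIN_EMAIL=<your-email-address> \
  TEMP_BUCKET=<temp-bucket-name>
```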
The following commands are also available:

- `make deploy-artefacts`: Packages and uploads the Forget task Docker image and frontend React app to the solution bucket. This will trigger CodePipeline to automatically deploy these artefacts.
- `make deploy-vpc`: Deploys only the VPC CloudFormation template.
- `make deploy-cfn`: Deploys only the CloudFormation template.
- `make redeploy-containers`: Manually packages and deploys the Forget task Docker image to ECR via the AWS CLI rather than using CodePipeline.
- `make redeploy-frontend`: Manually packages and deploys the frontend React app to S3 via the AWS CLI rather than using CodePipeline.
- `make start-frontend-remote`: Opens the frontend of the deployed Amazon S3 Find and Forget solution.
Important: Running the frontend/forget task locally requires the solution CloudFormation stack to be deployed. For more info, see Build and Deploy From Source
To run the frontend locally, run the following commands:

- `make setup-frontend-local-dev`: Downloads a copy of the configuration file required for the frontend app to run locally.
- `make start-frontend-local`: Runs the frontend app locally on `localhost:3000`.
To allow your locally running frontend to connect to the deployed API, you will need to set the `AccessControlAllowOriginOverride` parameter to `*` when deploying the solution stack.
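The parameter can also be changed on an already-deployed stack with the AWS CLI; a hypothetical sketch (the stack name and template path are assumptions, not part of the solution's documented workflow):

```shell
aws cloudformation deploy \
  --stack-name S3F2 \
  --template-file packaged-template.yaml \
  --parameter-overrides AccessControlAllowOriginOverride='*' \
  --capabilities CAPABILITY_IAM CAPABILITY_NAMED_IAM
```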
To run the "Forget" task locally using Docker, run the following commands:

```shell
docker build -f backend/ecs_tasks/delete_files/Dockerfile -t s3f2 .
make run-local-container ROLE_NAME=<your-sqs-access-role-name>
```
The container needs to connect to the deletion queue deployed by the solution and therefore AWS credentials are required in the container environment. You will need to set up an IAM role which has access to process messages from the queue and provide the role name as an input. The above command will perform STS Assume Role via the AWS CLI, using `ROLE_NAME` as the target role, in order to obtain temporary credentials. These temporary credentials will be injected into the container as environment variables.
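Conceptually, this is roughly equivalent to the following sketch (the role ARN, session name, and image tag are illustrative; the actual Makefile target may differ in detail):

```shell
# Assume the role and capture the temporary credentials as JSON.
CREDS=$(aws sts assume-role \
  --role-arn "arn:aws:iam::<account-id>:role/<your-sqs-access-role-name>" \
  --role-session-name s3f2-local \
  --query Credentials \
  --output json)

# Inject the temporary credentials into the container as environment variables.
docker run --rm \
  -e AWS_ACCESS_KEY_ID="$(echo "$CREDS" | jq -r .AccessKeyId)" \
  -e AWS_SECRET_ACCESS_KEY="$(echo "$CREDS" | jq -r .SecretAccessKey)" \
  -e AWS_SESSION_TOKEN="$(echo "$CREDS" | jq -r .SessionToken)" \
  s3f2
```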
The command uses your default CLI profile to assume the role. You can override the profile being used as follows:

```shell
make run-local-container ROLE_NAME=<your-sqs-access-role-name> AWS_PROFILE=my-profile
```
Important: Running acceptance tests requires the solution CloudFormation stack to be deployed. For more info, see Build and Deploy From Source
The following commands are available for running tests:

- `make test`: Run all unit and acceptance tests for the backend and frontend.
- `make test-acceptance-cognito`: Run all backend task acceptance tests using Cognito authentication.
- `make test-acceptance-iam`: Run all backend task acceptance tests using IAM authentication.
- `make test-cfn`: Run CloudFormation related unit tests.
- `make test-unit`: Run all backend task unit tests.
- `make test-frontend`: Run all frontend tests.
Note: some acceptance tests require a KMS Symmetric Key to be created in advance and specified during the solution's deployment.
In this project, Python library dependencies are stored in two forms:

- `requirements.in` is a hand-managed file, and may contain loose version specifications.
- `requirements.txt` is a machine-generated file, produced from the former, which contains strict versions for all required dependencies.
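To illustrate the difference, the two files might look like this (package names and versions are hypothetical):

```
# requirements.in — hand-managed, loose pins
boto3>=1.26
pyarrow

# requirements.txt — machine-generated, strict pins
boto3==1.26.90
botocore==1.29.90    # transitive dependency, also pinned
pyarrow==11.0.0
```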
When running Make, if it detects a change to the `requirements.in` file it will automatically regenerate `requirements.txt` for you using the latest published versions of libraries. You can also manually trigger this by running Make with the `requirements.txt` file as the target, for instance:

```shell
make ./backend/ecs_tasks/delete_files/requirements.txt
```
For advanced use cases, you can use `pip-compile` outside of Make to change specific library versions without regenerating the entire file.
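For example, `pip-compile` (from pip-tools) supports upgrading a single package while leaving all other pins untouched; the package name and paths below are illustrative:

```shell
pip-compile --upgrade-package boto3 \
  --output-file backend/ecs_tasks/delete_files/requirements.txt \
  backend/ecs_tasks/delete_files/requirements.in
```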