# AWS S3 GitHub Action

Upload, download, or list files/folders through GitHub Actions.
```yaml
- uses: keithweaver/[email protected]
  with:
    command: cp
    source: ./local_file.txt
    destination: s3://yourbucket/folder/local_file.txt
    aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    aws_region: us-east-1
```
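For context, here is a minimal sketch of a complete workflow file that uploads a file on every push. The workflow name, trigger, and file path are placeholders, not something this action requires:

```yaml
name: Upload to S3

on: push

jobs:
  upload:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: keithweaver/[email protected]
        with:
          command: cp
          source: ./local_file.txt
          destination: s3://yourbucket/folder/local_file.txt
          aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws_region: us-east-1
```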
## Inputs
| Variable name | Required/Optional | Default | Description |
|---|---|---|---|
| `command` | Optional | `cp` | The command being performed. When using the AWS CLI, it is the portion following the service: `aws s3 cp ...` → `cp`, `aws s3 ls` → `ls`. |
| `source` | Required | N/A | Depending on the command, either the directory you are listing or the source file. |
| `destination` | Required for `cp`, `mv`, and `sync` | N/A | The location where you want the file to arrive. |
| `aws_access_key_id` | Optional | N/A | Credentials from an IAM role for getting access to a bucket. More info |
| `aws_secret_access_key` | Optional | N/A | Credentials from an IAM role for getting access to a bucket. More info |
| `aws_session_token` | Optional | N/A | Credentials from an IAM role for getting access to a bucket. More info |
| `aws_region` | Optional | N/A | The region of the bucket. The S3 namespace is global, but each bucket is regional. |
| `metadata_service_timeout` | Optional | N/A | The number of seconds to wait until the metadata service request times out. More info |
| `flags` | Optional | N/A | Additional flags appended to the command. |
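As an illustration of how `command` and `source` combine, here is a sketch of a list operation; it needs no `destination`, and the bucket path is a placeholder:

```yaml
- uses: keithweaver/[email protected]
  with:
    command: ls
    source: s3://yourbucket/folder/
    aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    aws_region: us-east-1
```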
## Where can I see this run in a pipeline as an example?

Here is the test/verification pipeline that is used.
## How can I use a specific version or test a feature branch?

You specify the tag or branch with `@` after the action name. The example below uses `v1.0.0`, which is based on a tag:
```yaml
- uses: keithweaver/[email protected]
  ...
```
This uses the `master` branch:

```yaml
- uses: keithweaver/aws-s3-github-action@master
```
This uses a feature branch called `dev-branch`:

```yaml
- uses: keithweaver/aws-s3-github-action@dev-branch
```
It is recommended that you point to a specific version to avoid unexpected changes affecting your workflow.
## Can I run this locally with Docker?
```bash
# You should have Docker installed and running locally.
docker build . -t aws-s3-action
docker run \
  --env INPUT_AWS_ACCESS_KEY_ID="<ACCESS_KEY>" \
  --env INPUT_AWS_SECRET_ACCESS_KEY="<ACCESS_SECRET>" \
  --env INPUT_SOURCE="./sample.txt" \
  --env INPUT_DESTINATION="s3://yourbucket/sample.txt" \
  aws-s3-action
# The environment variables must use this INPUT_ naming or the inputs will not be set.
```
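The other inputs from the table above can be passed the same way. The names below assume the standard `INPUT_<NAME>` mapping GitHub Actions uses for action inputs; the values are placeholders:

```bash
docker run \
  --env INPUT_AWS_ACCESS_KEY_ID="<ACCESS_KEY>" \
  --env INPUT_AWS_SECRET_ACCESS_KEY="<ACCESS_SECRET>" \
  --env INPUT_COMMAND="cp" \
  --env INPUT_AWS_REGION="us-east-1" \
  --env INPUT_SOURCE="./folder1/" \
  --env INPUT_DESTINATION="s3://yourbucket/folder1/" \
  --env INPUT_FLAGS="--recursive" \
  aws-s3-action
```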
## Can I run this locally outside of Docker?

You can run the bash script directly:
```bash
INPUT_AWS_ACCESS_KEY_ID="<ACCESS_KEY>" \
INPUT_AWS_SECRET_ACCESS_KEY="<ACCESS_SECRET>" \
INPUT_SOURCE="./sample.txt" \
INPUT_DESTINATION="s3://yourbucket/sample.txt" \
bash entrypoint.sh
```
## Troubleshooting

```text
upload failed: ./test1.txt to s3://.../test1.txt Unable to locate credentials
```

Your credentials are not set correctly. A common cause is forgetting to add the GitHub Secrets to the repository.
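One way to add the secrets, assuming you use the GitHub CLI (`gh`); the values are placeholders:

```bash
gh secret set AWS_ACCESS_KEY_ID --body "<ACCESS_KEY>"
gh secret set AWS_SECRET_ACCESS_KEY --body "<ACCESS_SECRET>"
```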
```text
An error occurred (SignatureDoesNotMatch) when calling the PutObject operation: The request signature we calculated does not match the signature you provided. Check your key and signing method.
```

The solution is here. More info here and here.
```text
botocore.utils.BadIMDSRequestError
```

Here is the solution. As a result, we set the AWS region as a required argument.
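In practice, make sure the region input is included in your step; a minimal sketch (the region value is a placeholder):

```yaml
- uses: keithweaver/[email protected]
  with:
    # ...other inputs...
    aws_region: us-east-1
```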
```text
upload failed: folder1/ to s3://.../folder1/ [Errno 21] Is a directory: '/github/workspace/folder1/'
```

You need to add a recursive flag for the `cp` command. It looks like:

```yaml
- uses: keithweaver/[email protected]
  name: Copy Folder
  with:
    command: cp
    source: ./folder1/
    destination: s3://bucket/folder1/
    aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    aws_region: us-east-1
    flags: --recursive
```
```text
An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
```

```text
fatal error: An error occurred (404) when calling the HeadObject operation: Key "verify-aws-s3-action/folder1/" does not exist
```

You need to add the recursive flag: `flags: --recursive`.
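For example, a minimal sketch of copying a folder down from S3 with the flag set; the bucket and paths are placeholders:

```yaml
- uses: keithweaver/[email protected]
  name: Download Folder
  with:
    command: cp
    source: s3://bucket/folder1/
    destination: ./folder1/
    aws_access_key_id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    aws_secret_access_key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    aws_region: us-east-1
    flags: --recursive
```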