I am loving this workflow, but something I'd like to do is invert the `cp` operation, like this:
```yaml
- uses: keithweaver/[email protected]
  with:
    command: cp
    source: s3://${{ secrets.AWS_PROD_BUCKET }}/assets
    destination: ./applications/composed-storybook/storybook-static/versions
    aws_access_key_id: ${{ secrets.AWS_PROD_S3_UPLOADER_ACCESS_KEY }}
    aws_secret_access_key: ${{ secrets.AWS_PROD_S3_UPLOADER_SECRET_KEY }}
    aws_region: us-east-1
```
so `source` would be the copy-from location and `destination` the copy-to.
I could probably avoid doing this if I could set the output of `ls` to a variable or something. Any help appreciated!
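For reference, one way to get the `ls` output into a variable without going through this action is to call the AWS CLI directly and write the result to `GITHUB_OUTPUT`. This is only a sketch, not part of the action's inputs: it assumes the AWS CLI preinstalled on GitHub-hosted runners, credentials passed as environment variables, and the same secrets as above; the `s3list`/`listing` names are made up for the example:

```yaml
- name: List bucket contents into a step output
  id: s3list
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_PROD_S3_UPLOADER_ACCESS_KEY }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_PROD_S3_UPLOADER_SECRET_KEY }}
    AWS_DEFAULT_REGION: us-east-1
  run: |
    # Multiline values need the heredoc syntax when written to GITHUB_OUTPUT
    {
      echo 'listing<<EOF'
      aws s3 ls "s3://${{ secrets.AWS_PROD_BUCKET }}/assets/"
      echo 'EOF'
    } >> "$GITHUB_OUTPUT"

- name: Use the listing in a later step
  run: echo "${{ steps.s3list.outputs.listing }}"
```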
Hi, I was also looking for something similar to be able to download. I didn't find anything in the action demo: https://github.com/keithweaver/aws-s3-github-action-demo/blob/master/.github/workflows/main.yml
Hey, I tested this by simply switching `source` with `destination`, and it works 👍🏻 ...at least with a single file:
```yaml
- name: fetch dicom test data
  uses: keithweaver/[email protected]
  with:
    command: cp
    source: s3://bucket/file.zip
    destination: ./path/file.zip
    ...
```
It works with multiple files as well. By using the recursive flag you can download all files from a folder.
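If the version of the action you pin doesn't forward extra flags, the plain CLI on a GitHub-hosted runner covers the recursive case too. A sketch, assuming the preinstalled AWS CLI and placeholder bucket, paths, and secret names (`aws s3 cp --recursive` copies every object under the prefix):

```yaml
- name: fetch a whole folder from S3
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    AWS_DEFAULT_REGION: us-east-1
  run: aws s3 cp s3://bucket/folder ./path --recursive
```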