I'm currently setting up a GitHub Action based on gzipper and noticed that there is currently no option to simply copy the original files into the (newly created) `outputPath` directory first. However, this would come in really handy for immediate deployment.
Hence I propose a `--copy-original-files` flag for this purpose as a possible enhancement.
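A minimal sketch of what the proposed flag could look like, assuming it slots into gzipper's existing `compress [options] <path> [outputPath]` signature (the flag itself and the `./site`/`./deploy` paths are hypothetical):

```yaml
# hypothetical usage of the proposed flag inside a workflow;
# --copy-original-files does not exist yet, and ./site / ./deploy
# are placeholder paths
- run: gzipper compress --copy-original-files --brotli ./site ./deploy
# ./deploy would then contain the originals plus their .br variants,
# ready to be deployed as-is
```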
If anyone needs a workaround, follow these steps:

1. Copy the uncompressed, original files to the compressed branch (a sketch for this step follows the snippet below).
2. Use this branch as both input and output (in my case `gh-pages`).

Step 2 can be done like this:
```yaml
# check out other branch (gh-pages)
- uses: actions/checkout@v2
  with:
    ref: gh-pages

# compress all files in all directories
- run: gzipper compress --include htm,html,css,js,svg,xml,map,json,img,png,jpg,jpeg --zopfli --brotli --remove-larger .

# commit
- uses: stefanzweifel/git-auto-commit-action@v4
  with:
    commit_message: Compress Files
    branch: gh-pages
    push_options: '--force'
```
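Step 1 depends on how the site is built. As a rough sketch, assuming a MkDocs site like the one behind the full action linked below, `mkdocs gh-deploy` can publish the uncompressed build to `gh-pages` before the compression steps run:

```yaml
# step 1 (sketch, assuming a MkDocs site): push the uncompressed,
# original files to the gh-pages branch first
- uses: actions/checkout@v2
- run: pip install mkdocs
- run: mkdocs gh-deploy --force
```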
Find the full action with `mkdocs deploy` here.

Thanks a lot for this amazing repo!