I am looking to gzip multiple files (into multiple .gz files) in a directory while keeping the originals.
I can do individual files using these commands:
find . -type f -name "*cache.html" -exec gzip {} \;
or
gzip *cache.html
but neither preserves the original. I tried
find . -type f -name "*cache.html" -exec gzip -c {} > {}.gz
but that only made a single file literally named {}.gz. Is there a simple way to do this?
gzip 1.6 (June 2013) added the -k, --keep option, so now you can keep the original while compressing, and the same flag works for all files recursively via find.
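A minimal sketch of the -k usage (assumes gzip >= 1.6; the directory and filename are only for the demo):

```shell
# set up a throwaway demo file
mkdir -p /tmp/gzdemo && cd /tmp/gzdemo
echo "hello" > page_cache.html

# -k / --keep compresses but leaves the original in place
# (-f added so the demo can be re-run without an overwrite prompt)
gzip -kf page_cache.html

# the same flag works recursively for all matching files
find . -type f -name "*cache.html" -exec gzip -kf {} \;
```

After this, both page_cache.html and page_cache.html.gz exist.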
Found at: https://unix.stackexchange.com/questions/46786/how-to-tell-gzip-to-keep-original-file
Since you have multiple files, GNU Parallel might be useful.
Watch the intro video for a quick introduction: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
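A sketch of the GNU Parallel approach, assuming parallel is installed and gzip >= 1.6 (the demo directory and filenames are made up):

```shell
# create a couple of demo files
mkdir -p /tmp/pardemo && cd /tmp/pardemo
echo "a" > a_cache.html
echo "b" > b_cache.html

# each matching file is compressed in its own parallel job;
# -k keeps the originals, -f allows re-running the demo
find . -type f -name "*cache.html" | parallel gzip -kf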
Your > in the last command gets parsed by the same shell which runs find, so the redirection happens once, before find ever sees it. Use a nested shell.

I'd use bash(1)'s simple for construct for this. If I knew the filenames were 'sane', I'd leave off the "" around the arguments, because I'm lazy, and my filenames are usually sane. But scripts don't have that luxury.
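The nested-shell fix can be sketched like this (the demo directory and filename are assumptions for illustration): the inner sh parses the redirection per file, with _ filling $0 and each found path arriving as $1.

```shell
# demo setup
mkdir -p /tmp/nestdemo && cd /tmp/nestdemo
echo "x" > x_cache.html

# gzip -c writes to stdout, so the original survives;
# the > is evaluated by the inner sh, once per file
find . -type f -name "*cache.html" -exec sh -c 'gzip -c "$1" > "$1.gz"' _ {} \;
```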
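The for-construct version might look like this (demo directory and filename are hypothetical); the quotes around "$f" are what keep filenames with spaces intact:

```shell
# demo setup
mkdir -p /tmp/fordemo && cd /tmp/fordemo
echo "y" > y_cache.html

# gzip -c leaves the original untouched and writes the
# compressed stream to a sibling .gz file
for f in *cache.html; do
    gzip -c "$f" > "$f.gz"
done
```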