GZip every file separately

Published 2020-02-20 07:02

Question:

How can we GZip every file separately?

I don't want to have all of the files in a big tar.

Answer 1:

You can use gzip *


Note:

  • This will gzip each file individually and delete the originals.
  • Use the -k (--keep) option to keep the original files.
  • This may fail if you have a huge number of files, due to the shell's argument-list limit ("Argument list too long").
  • To run gzip in parallel, see @MarkSetchell's answer below.
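To illustrate the -k behaviour, here is a minimal sketch in a throwaway directory (the filenames a.txt and b.txt are made up for the demo):

```shell
# Work in a temporary directory so nothing real is touched.
tmp=$(mktemp -d)
cd "$tmp"
printf 'hello\n' > a.txt
printf 'world\n' > b.txt

# Compress each file separately; -k keeps the originals in place,
# so afterwards both a.txt and a.txt.gz (etc.) exist side by side.
gzip -k *
```

Without -k, the same command would leave only the .gz files behind.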


Answer 2:

An easy and very fast answer that will use all your CPU cores in parallel:

parallel gzip ::: *

GNU Parallel is a fantastic tool that should be used far more in a world where CPUs are gaining more cores rather than more speed. Its documentation has loads of examples that we would all do well to take 10 minutes to read.



Answer 3:

After seven years, this highly upvoted comment still doesn't have its own full-fledged answer, so I'm promoting it now:

gzip -r .

This has two advantages over the currently accepted answer: it works recursively into any subdirectories, and it won't fail with "Argument list too long" if the number of files is very large.
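A small sketch of the recursive behaviour (the directory and file names here are invented for the demo); note that gunzip -r reverses the whole tree the same way:

```shell
# Build a tiny nested tree in a temporary directory.
tmp=$(mktemp -d)
mkdir -p "$tmp/sub/deeper"
printf 'top\n'  > "$tmp/top.txt"
printf 'deep\n' > "$tmp/sub/deeper/deep.txt"

# -r descends into subdirectories, replacing each regular file
# with its .gz equivalent in place.
gzip -r "$tmp"
```

Afterwards every regular file in the tree is a .gz file; running gunzip -r on the same directory restores the originals.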



Answer 4:

If you want to gzip every file recursively, you could use find piped to xargs:

$ find . -type f -print0 | xargs -0r gzip
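One refinement worth noting (a sketch, not part of the original answer): excluding files that already end in .gz stops gzip from complaining when the pipeline is re-run over the same tree:

```shell
# Temporary directory with one fresh file and one already-compressed file.
tmp=$(mktemp -d)
printf 'data\n' > "$tmp/new.txt"
printf 'old\n' | gzip > "$tmp/already.gz"

# -print0 / -0 keep filenames with spaces or newlines safe;
# ! -name '*.gz' skips files that are already compressed;
# -r (GNU xargs) avoids running gzip at all when find matches nothing.
find "$tmp" -type f ! -name '*.gz' -print0 | xargs -0r gzip
```

Here new.txt becomes new.txt.gz, while already.gz is left untouched instead of becoming already.gz.gz.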


Answer 5:

Try a loop:

$ for file in *; do gzip "$file"; done
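The loop can be hardened slightly; this sketch (file names invented for the demo) skips anything that is not a regular file and anything already compressed, so gzip never prints errors for directories or .gz files:

```shell
# Temporary directory with a subdirectory, a plain file,
# and a file that is already gzipped.
tmp=$(mktemp -d)
cd "$tmp"
mkdir subdir
printf 'x\n' > file1.txt
printf 'y\n' | gzip > done.gz

for file in *; do
    [ -f "$file" ] || continue              # skip directories and other non-files
    case "$file" in *.gz) continue ;; esac  # skip already-compressed files
    gzip "$file"
done
```

After the loop, file1.txt has become file1.txt.gz, while subdir and done.gz are untouched.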


Answer 6:

Or, if you have pigz (a gzip utility that parallelizes compression across multiple processors and cores):

pigz *


Tags: linux bash gzip