How can we GZip every file separately?
I don't want to have all of the files in a big tar.
You can use
gzip *
Note: the -k (--keep) option will keep the original files.
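For example, assuming a reasonably recent gzip (1.6 or later, where --keep is available), this compresses every file in the current directory while leaving the originals in place; directories matched by the glob are simply skipped with a warning:
gzip -k *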
An easy and very fast answer that will use all of your CPU cores in parallel:
parallel gzip ::: *
GNU Parallel is a fantastic tool that should be used far more in a world where CPUs are getting more cores rather than more speed. There are loads of examples in its documentation that we would all do well to take 10 minutes to read.
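As a sketch of one common variation, GNU Parallel also combines well with find when you need recursion; this assumes GNU Parallel is installed and uses null-delimited file names so paths with spaces are handled safely:
find . -type f ! -name '*.gz' -print0 | parallel -0 gzip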
After seven years, this highly upvoted comment still doesn't have its own full-fledged answer, so I'm promoting it now:
gzip -r .
This has two advantages over the currently accepted answer: it works recursively if there are any subdirectories, and it won't fail with an "Argument list too long" error if the number of files is very large.
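As an illustration (the extra flags are assumptions, not part of the answer above), you can combine -r with -k to keep the originals on gzip 1.6 or later, and -v to print each file name as it is compressed:
gzip -rkv .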
If you want to gzip every file recursively, you could use find piped to xargs:
$ find . -type f -print0 | xargs -0r gzip
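If your find and xargs are the GNU versions, you can also ask xargs to run several gzip processes at once; the worker count below is an arbitrary choice for illustration:
$ find . -type f ! -name '*.gz' -print0 | xargs -0r -P4 gzip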
Try a loop:
$ for file in *; do gzip "$file"; done
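A slightly more defensive sketch of the same loop skips anything that is not a regular file, avoids re-compressing existing .gz files, and uses a ./ prefix so names starting with a dash are not taken as options:
for file in *; do
  [ -f "$file" ] || continue            # skip directories and other non-regular files
  case $file in *.gz) continue ;; esac  # don't compress files that are already gzipped
  gzip "./$file"                        # ./ protects against names beginning with -
done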
Or, if you have pigz (a gzip-compatible utility that parallelizes compression across multiple processors and cores):
pigz *
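pigz accepts most of the usual gzip options; as one example, assuming pigz is installed, this keeps the originals and caps the number of compression threads (the value passed to -p is arbitrary):
pigz -k -p 8 *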