I have thousands of PNG files that I'd like to make smaller with pngcrush. I have a simple find .. -exec job, but it's sequential. My machine has quite some resources, and I'd like to do this in parallel.
The operation to be performed on every PNG is:
pngcrush input output && mv output input
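The sequential job looks something like this sketch (the *.png pattern and the .tmp suffix are placeholders, not the exact command):

    # run pngcrush on each file, one at a time; the trailing "sh" fills $0, each file name becomes $1
    find . -name '*.png' -exec sh -c 'pngcrush "$1" "$1.tmp" && mv "$1.tmp" "$1"' sh {} \;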
Ideally I could specify the maximum number of parallel operations.
Is there a way to do this with bash and/or other shell helpers? I'm on Ubuntu or Debian.
You can use custom find/xargs solutions (see Bart Sas' answer), but when things become more complex you have, at least, two powerful options:

1. parallel (from package moreutils)
2. GNU Parallel

With GNU Parallel (http://www.gnu.org/software/parallel/) it can be done like:
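(A sketch; the *.png pattern and the .tmp suffix are my assumptions, and {} is GNU Parallel's placeholder for each input file name.)

    # GNU Parallel quotes {} substitutions, so file names with spaces are safe
    find . -name '*.png' | parallel 'pngcrush {} {}.tmp && mv {}.tmp {}'

GNU Parallel runs one job per CPU core by default; add -j <nr_procs> to cap the number of parallel jobs, and combine find -print0 with parallel -0 if file names may contain newlines.

With parallel from moreutils, whose syntax differs from GNU Parallel's, a comparable sketch is:

    # runs "sh -c '...' sh FILE" once per argument after --, up to <nr_procs> at a time
    parallel -j <nr_procs> sh -c 'pngcrush "$1" "$1.tmp" && mv "$1.tmp" "$1"' sh -- *.png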
You can use xargs to run multiple processes in parallel:
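A sketch consistent with the flags explained below (the *.png pattern and the .tmp suffix are my assumptions; replace <nr_procs> with the number of processes you want):

    find . -name '*.png' -print0 | xargs -0 -n 1 -P <nr_procs> sh -c 'pngcrush "$1" "$1.tmp" && mv "$1.tmp" "$1"' sh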
xargs will read the list of files produced by find (separated by NUL characters (-0)) and run the provided command (sh -c '...' sh) with one parameter at a time (-n 1). xargs will run up to <nr_procs> processes (-P <nr_procs>) in parallel. The trailing sh becomes $0 inside the command, so each file name arrives as $1.