Batch blur images using multiple cores

Posted 2019-09-10 23:43

I'm trying to blur the bottom section of thousands (>50,000) of images using ImageMagick. The images are 800x600. The command-line code below works, but it takes a long time. Is there any way to run this in parallel, ideally called from within R using system()?

I got this code off the internet, so I'm not sure it's even the best way to achieve this. Any help would be greatly appreciated. Thanks in advance!

(OS: OS X El Capitan)

cd /Users/Desktop/test_images
for img in *.jpg; do
    convert "$img" \
    \( -size 800x525 xc:black -size 800x75 xc:white -append \) \
    -compose blur -define compose:args=6 -composite \
    /Users/Desktop/test_images/results/"$img"
done
cd

1 Answer
萌系小妹纸 · answered 2019-09-11 00:27

I think this command does something very similar to what you are doing but is FAR quicker: it clones the image, crops away everything above row 525, blurs the remaining bottom strip, and composites it back over the original along the bottom edge. See if you like the effect:

convert start.jpg \( +clone -crop +0+525 -blur x4 \) -gravity south -composite result.jpg


If that works, you can use GNU Parallel to run it across all your cores:

parallel 'convert {} \( +clone -crop +0+525 -blur x4 \) -gravity south -composite results/{}' ::: *.jpg
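By default GNU Parallel runs one convert per CPU core; if you would rather cap that, say to keep the machine responsive while 50,000 images churn through, the -j option should do it, along these lines:

parallel -j 4 'convert {} \( +clone -crop +0+525 -blur x4 \) -gravity south -composite results/{}' ::: *.jpg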

You can also put that lot in a script called BlurTitle like this:

#!/bin/bash
parallel 'convert {} \( +clone -crop +0+525 -blur x4 \) -gravity south -composite results/{}' ::: *.jpg

and then make it executable with:

chmod +x BlurTitle

and call it from R with:

system("./BlurTitle")

or from the Terminal with:

./BlurTitle

If you get "Argument list too long", you can express it the other way around, sending the filenames on stdin rather than as arguments after the command:

cd /path/to/images
find . -name \*.jpg -print0 | parallel -0 'convert {} \( +clone -crop +0+525 -blur x4 \) -gravity south -composite results/{}'
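One more thing worth checking before kicking off tens of thousands of jobs: convert will not create the results directory for you, so make sure it exists first with something like:

mkdir -p /path/to/images/results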