I am working on a site that uses ImageMagick to generate images. The site will get hundreds of requests every minute, and using ImageMagick to handle them causes the site to crash.
So we implemented Redis and Php-resque to do the ImageMagick generation in the background on a separate server so that it doesn't crash our main one. The problem is that it's still taking a very long time to get images done. A user might have to wait 2-3 minutes for an image request because the server is so busy processing these images.
I am not sure what information to give you, as I'm mostly looking for advice. I think that if we can cut down the initial processing time for each ImageMagick request, then obviously that will increase the number of images we can process.
Below is a sample of the ImageMagick script that we use:
convert -size 600x400 xc:none \( ".$path."assets/images/bases/base_image_69509021433289153_8_0.png -fill rgb\(255,15,127\) -colorize 100% \) -composite \( ".$path."assets/images/bases/eye_image_60444011438514404_8_0.png -fill rgb\(15,107,255\) -colorize 100% \) -composite \( ".$path."assets/images/markings/marking_clan_8_marking_10_1433289499.png -fill rgb\(255,79,79\) -colorize 100% \) -composite \( ".$path."assets/images/bases/shading_image_893252771433289153_8_0.png -fill rgb\(135,159,255\) -colorize 100% \) -compose Multiply -composite \( ".$path."assets/images/highlight_image_629750231433289153_8_0.png -fill rgb\(27,35,36\) -colorize 100% \) -compose Overlay -composite \( ".$path."assets/images/lineart_image_433715161433289153_8_0.png -fill rgb\(0,0,0\) -colorize 100% \) -compose Over -composite ".$path."assets/generated/queue/tempt_preview_27992_userid_0_".$filename."_file.png
My theory is that it takes quite a long time because of the process of colouring the images. Is there a way to optimise this process at all?
If anyone has experience with handling heavy loads of ImageMagick processes, or can see some glaringly easy ways to optimise our requests, I'd be very grateful.
Thank you :)
Your command actually boils down to this:
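(in the sketches below the six asset paths are shortened to 1.png through 6.png, and the output to result.png, purely for readability)

    convert -size 600x400 xc:none                                                  \
        \( 1.png -fill rgb\(255,15,127\)  -colorize 100% \) -composite             \
        \( 2.png -fill rgb\(15,107,255\)  -colorize 100% \) -composite             \
        \( 3.png -fill rgb\(255,79,79\)   -colorize 100% \) -composite             \
        \( 4.png -fill rgb\(135,159,255\) -colorize 100% \) -compose Multiply -composite \
        \( 5.png -fill rgb\(27,35,36\)    -colorize 100% \) -compose Overlay  -composite \
        \( 6.png -fill rgb\(0,0,0\)       -colorize 100% \) -compose Over     -composite result.png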
My thoughts are as follows:
Point 1

The first -composite onto a blank canvas seems pointless - presumably 1.png is a 600x400 PNG with transparency, so your first line can avoid the compositing operation and save 16% of the processing time by changing to something like this (sketched below with the shortened names from above):
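    convert \( 1.png -fill rgb\(255,15,127\)  -colorize 100% \)                    \
        \( 2.png -fill rgb\(15,107,255\)  -colorize 100% \) -composite             \
        \( 3.png -fill rgb\(255,79,79\)   -colorize 100% \) -composite             \
        \( 4.png -fill rgb\(135,159,255\) -colorize 100% \) -compose Multiply -composite \
        \( 5.png -fill rgb\(27,35,36\)    -colorize 100% \) -compose Overlay  -composite \
        \( 6.png -fill rgb\(0,0,0\)       -colorize 100% \) -compose Over     -composite result.png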
Point 2

I put the equivalent of your command into a loop and did 100 iterations, and it took 15 seconds. I then changed all your reads of PNG files into reads of MPC files - Magick Pixel Cache files. That reduced the processing time to just under 10 seconds, i.e. by 33%. A Magick Pixel Cache is just a pre-decompressed, pre-decoded file that can be read directly into memory without any CPU effort. You could pre-create them whenever your catalogue changes and store them alongside the PNG files. To make one you do:
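    convert image.png image.mpc     # run once per asset; image.png is a placeholder name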
and you will get out image.mpc and image.cache. Then you would simply change your code to look like this (again with the shortened names):
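    convert -size 600x400 xc:none                                                  \
        \( 1.mpc -fill rgb\(255,15,127\)  -colorize 100% \) -composite             \
        \( 2.mpc -fill rgb\(15,107,255\)  -colorize 100% \) -composite             \
        \( 3.mpc -fill rgb\(135,159,255\) -colorize 100% \) -compose Multiply -composite \
        \( 4.mpc -fill rgb\(255,79,79\)   -colorize 100% \) -composite             \
        \( 5.mpc -fill rgb\(27,35,36\)    -colorize 100% \) -compose Overlay  -composite \
        \( 6.mpc -fill rgb\(0,0,0\)       -colorize 100% \) -compose Over     -composite result.png

This combines naturally with the change from Point 1 as well.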
Point 3

Unfortunately you haven't answered my questions yet, but if your assets catalogue is not too big, you could put that (or the MPC equivalents above) onto a RAM disk at system startup.
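On Linux, for example, a small tmpfs mount would do the job - the mount point, size and source path below are only placeholders:

    mkdir -p /mnt/assets_ram
    mount -t tmpfs -o size=256m tmpfs /mnt/assets_ram
    cp -r /var/www/yoursite/assets/images /mnt/assets_ram/    # copy (or regenerate) at startup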
Point 4
You should definitely run in parallel - that will yield the biggest gains of all. It is very simple with GNU Parallel - example here.
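As a rough sketch of the idea - generate_image.sh here is a hypothetical wrapper around your convert command, and one job file per request is assumed:

    # run up to 8 image generations concurrently; -j sets the degree of parallelism
    parallel -j 8 ./generate_image.sh {} ::: jobs/*.job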
If you are using REDIS, it is actually easier than that. Just LPUSH your MIME-encoded images into a REDIS list, and then run multiple workers that all sit there doing BLPOPs of jobs to do.
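A bare-bones illustration of that pattern with redis-cli - the list name "jobs", the base64 encoding and the temporary path are assumptions, not part of your existing setup (and -w0 needs GNU base64):

    # producer: push an encoded image/job onto the "jobs" list
    base64 -w0 result.png | tr -d '\n' | redis-cli -x LPUSH jobs

    # worker: block until a job arrives, pop it, decode it and process it
    while true; do
        payload=$(redis-cli BLPOP jobs 0 | tail -n 1)   # raw output: list name, then the value
        printf '%s' "$payload" | base64 -d > /tmp/job.png
        # ... hand /tmp/job.png to the next processing step here ...
    done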
If I run one generator process as above and have it generate 100,000 images of 200kB each, and read them out with 4 worker processes on my reasonably specced iMac, it takes 59 seconds - so around 1,700 images/s can pass through REDIS.
Is the queue being processed one job at a time? If so, have you tried running concurrent jobs in parallel, so that more than one element is processed at once?