I keep getting a sporadic error from Cloud Functions for Firebase when converting a relatively small image (2 MB). When successful, the function takes about 2000 ms or less to finish, and according to the ImageMagick documentation I should not see any problems.
I tried increasing the buffer size for the command, which isn't allowed from within Firebase, and I tried to find alternatives to .spawn(), as that could be overloaded with garbage and slow things down. Nothing works.
Update: It looks like they now preserve settings on re-deploy, so you can safely change the memory allocation in the Cloud console!
I was lost in the UI and couldn't find any option to change the memory, but finally found it: you can adjust it from within the Cloud console.
Another option here would be to avoid using .spawn() altogether. There is a great image-processing package for Node called Sharp that uses the low-memory-footprint library libvips. You can check out the Cloud Functions sample on GitHub.
Alternatively, there is a Node wrapper for ImageMagick (and GraphicsMagick) called gm. It even supports the -limit option to tell IM about your resource limits.
You can add the configuration in your Firebase function definitions, something like:
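A sketch using the firebase-functions v1 runWith() API; the function name and handler body here are placeholders:

```javascript
const functions = require('firebase-functions');

// Request more memory and a longer timeout for this function
// (e.g. memory: '256MB', '512MB', '1GB', '2GB'; timeoutSeconds up to 540).
exports.convertImage = functions
  .runWith({ memory: '1GB', timeoutSeconds: 300 })
  .https.onRequest((req, res) => {
    // ... run the image conversion here ...
    res.status(200).send('done');
  });
```

Settings declared this way live in your code, so they survive re-deploys.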
[Update] As one commenter suggested, this should no longer be an issue, as Firebase functions now maintain their settings on re-deploy. Thanks, Firebase!
Turns out, and this is not obvious or well documented, you can increase the memory allocated to your functions in the Google Cloud Functions console. You can also increase the timeout for long-running functions. That solved the memory-overload problem, and everything is working great now.
Edit: Note that Firebase will reset your values to the defaults on deploy, so you should remember to log in to the console and update them right away. I am still looking for a way to update these settings via the CLI; I'll update this answer when I find it.