Cloud Functions for Firebase killed due to memory

Posted 2020-05-14 14:07

I keep getting a sporadic error from Cloud Functions for Firebase when converting a relatively small image (2 MB). When successful, the function takes about 2000 ms or less to finish, and according to the ImageMagick documentation I should not see any problems.

I tried increasing the buffer size for the command, which isn't allowed from within Firebase, and I tried to find alternatives to .spawn() since it could be getting overloaded with garbage and slowing things down. Nothing works.
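For context, the conversion itself is roughly along the lines of the sketch below (the trigger, paths, and resize arguments are illustrative, not my exact code):

    const functions = require('firebase-functions');
    const { Storage } = require('@google-cloud/storage');
    const { spawn } = require('child-process-promise');
    const path = require('path');
    const os = require('os');

    const gcs = new Storage();

    exports.convertImage = functions.storage.object().onFinalize(async (object) => {
      const fileName = path.basename(object.name);
      const tempFile = path.join(os.tmpdir(), fileName);
      const bucket = gcs.bucket(object.bucket);

      // Download the uploaded image to /tmp, which is an in-memory filesystem
      // and therefore also counts against the function's memory limit.
      await bucket.file(object.name).download({ destination: tempFile });

      // Spawn ImageMagick's convert binary, which is preinstalled in the Cloud Functions runtime.
      await spawn('convert', [tempFile, '-resize', '1024x1024>', tempFile]);

      await bucket.upload(tempFile, { destination: `converted/${fileName}` });
    });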

10 Answers
够拽才男人
Answer 2 · 2020-05-14 14:24

Update: It looks like settings are now preserved on re-deploy, so you can safely change the memory allocation in the Cloud console!

Answer 3 · 2020-05-14 14:30

I was lost in the UI and couldn't find any option to change the memory, but I finally found it:

  1. Go to the Google Cloud Platform Console (not the Firebase console).
  2. Select Cloud Functions in the menu.
  3. You should now see your Firebase function listed there; if not, check that you selected the right project.
  4. Ignore all the checkboxes, buttons, and menu items; just click on the name of the function.
  5. Click Edit (top menu), change only the allocated memory, and click Save.
霸刀☆藐视天下
Answer 4 · 2020-05-14 14:32

You can adjust your memory here:

[Screenshot: the memory allocation setting in the Cloud Functions console]

劫难
Answer 5 · 2020-05-14 14:34

Another option here would be to avoid using .spawn() altogether.

There is a great image-processing package for Node called Sharp that uses the low-memory-footprint library libvips. You can check out the Cloud Function sample on GitHub.
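For example, here is a minimal sketch of an in-memory resize with Sharp (the dimensions and output format are assumptions, not anything prescribed by the library):

    const sharp = require('sharp');

    // Resize a buffer entirely in memory; libvips keeps the working set small
    // compared to spawning a separate `convert` process.
    async function makeThumbnail(inputBuffer) {
      return sharp(inputBuffer)
        .resize({ width: 200 })
        .jpeg({ quality: 80 })
        .toBuffer();
    }

In a function you would download the source image (or pipe the Cloud Storage read stream straight into sharp(), since Sharp instances are also streams), then upload the returned buffer.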

Alternatively, there is a Node wrapper for ImageMagick (and GraphicsMagick) called gm. It even supports the -limit option for reporting your resource limitations to IM.
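A sketch of what that might look like (the limit values and sizes here are illustrative and would need tuning to your function's allocation):

    // Drive the ImageMagick binaries rather than GraphicsMagick.
    const gm = require('gm').subClass({ imageMagick: true });

    // Resize while passing explicit resource caps through to ImageMagick's -limit.
    function resizeWithLimits(inputPath, outputPath) {
      return new Promise((resolve, reject) => {
        gm(inputPath)
          .limit('memory', '128MB')
          .limit('map', '256MB')
          .resize(200, 200)
          .write(outputPath, (err) => (err ? reject(err) : resolve(outputPath)));
      });
    }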

对你真心纯属浪费
Answer 6 · 2020-05-14 14:39

You can add the configuration in your Firebase function definitions, something like:

functions.runWith({ memory: '2GB', timeoutSeconds: 360 })
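For context, a typical way to attach this to a storage-triggered function might look like the following (the function name and trigger are illustrative):

    const functions = require('firebase-functions');

    // runWith() is chained before the trigger builder; the settings apply only to this function.
    exports.convertImage = functions
      .runWith({ memory: '2GB', timeoutSeconds: 360 })
      .storage.object()
      .onFinalize(async (object) => {
        // ... image conversion goes here ...
      });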
Root(大扎)
Answer 7 · 2020-05-14 14:40

[Update] As one commenter suggested, this should no longer be an issue, as Firebase functions now maintain their settings on re-deploy. Thanks, Firebase!

It turns out, and this is not obvious or documented, that you can increase the memory allocated to your functions in the Google Cloud Functions console. You can also increase the timeout for long-running functions. This solved the memory-overload problem, and everything is working great now.

Edit: Note that Firebase will reset these values to the defaults on deploy, so remember to log in to the console and update them right away. I am still looking for a way to update these settings via the CLI and will update this answer when I find it.
