Why is Erlang crashing on large sequences?

Posted 2019-03-16 15:01

I have just started learning Erlang and am trying out some Project Euler problems to get started. However, I seem to be unable to do any operations on large sequences without crashing the Erlang shell.

For example, even this:

lists:seq(1,64000000).

crashes erlang, with the error:

eheap_alloc: Cannot allocate 467078560 bytes of memory (of type "heap").

The actual number of bytes varies, of course.

Now, half a gig is a lot of memory, but a system with 4 GB of RAM and plenty of room for virtual memory should be able to handle it.

Is there a way to let Erlang use more memory?

4 Answers
唯我独甜
#2 · 2019-03-16 15:23

Possibly a noob answer (I'm a Java dev), but the JVM artificially limits the amount of memory available to help detect memory leaks more easily. Perhaps Erlang has similar restrictions in place?
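
(As an aside, not part of the original answer: since Erlang/OTP 19 a process can opt into exactly this kind of cap with the max_heap_size process flag. A minimal sketch, with arbitrary numbers:)

%% Sketch, assumes Erlang/OTP 19+: cap the worker's heap so a runaway
%% allocation kills only that process, not the whole VM.
{Pid, Ref} = spawn_monitor(fun() ->
    process_flag(max_heap_size,
                 #{size => 1000000,   % limit in words (arbitrary)
                   kill => true,      % kill the process when exceeded
                   error_logger => true}),
    lists:seq(1, 64000000)            % blows past the cap
end),
receive {'DOWN', Ref, process, Pid, Reason} -> Reason end.
%% Reason is the atom 'killed' when the cap triggers.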

smile是对你的礼貌
#3 · 2019-03-16 15:28

Also, both Windows and Linux have limits on the maximum amount of memory a process image can occupy. As I recall, on Linux it is half a gigabyte.

The real question is why these operations aren't being done lazily ;)

放我归山
#4 · 2019-03-16 15:31

Your OS may have a default limit on the size of a user process. On Linux you can change this with ulimit.

You probably want to iterate over these 64000000 numbers without needing them all in memory at once. Lazy lists let you write code similar in style to the list-all-at-once code:

-module(lazy).
-export([seq/2]).

%% seq/2 returns a zero-argument fun instead of a list. Calling the
%% fun yields either [Head | NextLazySeq] or [], so only one cell is
%% materialised at a time.
seq(M, N) when M =< N ->
    fun() -> [M | seq(M + 1, N)] end;
seq(_, _) ->
    fun() -> [] end.

1> Ns = lazy:seq(1, 64000000).
#Fun<lazy.0.26378159>
2> hd(Ns()).
1
3> Ns2 = tl(Ns()).
#Fun<lazy.0.26378159>
4> hd(Ns2()).
2
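
To actually compute with the sequence, you can walk it with a tail-recursive fold. The fold/3 below is my own sketch of such a helper (you would add it to the lazy module's exports; it is not part of OTP):

%% Sketch of an assumed helper for the lazy module above: forces one
%% cell at a time, so memory use stays constant however long the
%% sequence is. The recursive call is a tail call.
fold(F, Acc, Lazy) ->
    case Lazy() of
        []         -> Acc;
        [X | Rest] -> fold(F, F(X, Acc), Rest)
    end.

Summing the whole range then becomes lazy:fold(fun(X, Sum) -> X + Sum end, 0, lazy:seq(1, 64000000)), which never materialises the full list.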
孤傲高冷的网名
#5 · 2019-03-16 15:31

This is a feature. We do not want one process to consume all the memory. It's like the fuse box in your house: it's there for the safety of us all.

You have to know Erlang's recovery model to understand why they let the process just die.
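
Roughly, you run risky work in its own process, let it crash, and have another process restart it. A minimal hand-rolled sketch of that pattern (my illustration, not from the answer; real code would use an OTP supervisor, and the module name is made up):

-module(restart).
-export([supervise/1]).

%% Run Work (a zero-argument fun) in its own process; restart it
%% whenever it exits abnormally. A real system would use an OTP
%% supervisor instead of this hand-rolled loop.
supervise(Work) ->
    {Pid, Ref} = spawn_monitor(Work),
    receive
        {'DOWN', Ref, process, Pid, normal} -> ok;  % finished cleanly
        {'DOWN', Ref, process, Pid, _Crash} -> supervise(Work)
    end.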
