Why is Erlang crashing on large sequences?

Posted 2019-03-16 15:07

Question:

I have just started learning Erlang and am trying out some Project Euler problems to get started. However, I seem unable to do any operations on large sequences without crashing the Erlang shell.

I.e., even this:

lists:seq(1, 64000000).

crashes Erlang with the error:

eheap_alloc: Cannot allocate 467078560 bytes of memory (of type "heap").

The actual number of bytes varies, of course.

Now half a gig is a lot of memory, but a system with 4 gigs of RAM and plenty of space for virtual memory should be able to handle it.

Is there a way to let Erlang use more memory?

Answer 1:

Your OS may have a default limit on the size of a user process. On Linux you can change this with ulimit.
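
If you are not sure whether the VM or an OS limit is to blame, you can ask the VM how much memory it has already allocated from the shell; erlang:memory/1 is a standard BIF (the actual figure will of course differ on your system):

1> erlang:memory(total).   % total bytes currently allocated by the emulator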

You probably want to iterate over these 64000000 numbers without needing them all in memory at once. Lazy lists let you write code similar in style to the list-all-at-once code:

-module(lazy).
-export([seq/2]).

%% Returns a zero-arity fun that, when called, yields either
%% [Head | LazyTail] (where LazyTail is another such fun) or [].
seq(M, N) when M =< N ->
    fun() -> [M | seq(M+1, N)] end;
seq(_, _) ->
    fun() -> [] end.

1> Ns = lazy:seq(1, 64000000).
#Fun<lazy.0.26378159>
2> hd(Ns()).
1
3> Ns2 = tl(Ns()).
#Fun<lazy.0.26378159>
4> hd(Ns2()).
2
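
To actually compute with such a lazy list without ever materializing it, you can fold over it. A minimal sketch, assuming the lazy:seq/2 representation above (the module name lazy_fold and the functions foldl/3 and sum/1 are illustrative, not part of the original answer):

-module(lazy_fold).
-export([foldl/3, sum/1]).

%% Fold F over a lazy list, i.e. a fun that yields [H | LazyTail] or [].
foldl(F, Acc, Lazy) ->
    case Lazy() of
        [H | LazyTail] -> foldl(F, F(H, Acc), LazyTail);
        []             -> Acc
    end.

%% Example: sum a lazy sequence in constant memory.
sum(Lazy) -> foldl(fun(X, Acc) -> X + Acc end, 0, Lazy).

Because foldl/3 is tail-recursive, only one cell of the sequence exists at a time, so lazy_fold:sum(lazy:seq(1, 64000000)) runs in constant space.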


Answer 2:

Possibly a noob answer (I'm a Java dev), but the JVM artificially limits the amount of memory to help detect memory leaks more easily. Perhaps Erlang has similar restrictions in place?



Answer 3:

This is a feature. We do not want one process to consume all the memory. It is like the fuse box in your house: it is there for the safety of us all.

You have to know Erlang's recovery model to understand why they let the process just die.
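
As a rough illustration of that model (not part of this answer), a parent process can monitor a worker and react when it dies instead of the whole system going down; spawn_monitor/1 and the 'DOWN' message are standard Erlang:

-module(crash_demo).
-export([run/0]).

%% Spawn a worker that crashes, and let the parent observe it and recover.
run() ->
    {Pid, Ref} = spawn_monitor(fun() -> exit(out_of_memory) end),
    receive
        {'DOWN', Ref, process, Pid, Reason} ->
            io:format("worker ~p died with ~p; restart or move on~n",
                      [Pid, Reason])
    end.

(Note that the eheap_alloc failure in the question brings down the whole emulator, so this kind of recovery only helps for crashes confined to a single process.)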



Answer 4:

Also, both Windows and Linux have limits on the maximum amount of memory a process image can occupy; as I recall, on Linux it is half a gigabyte.

The real question is why these operations aren't being done lazily ;)
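
For Project Euler-style problems there is an even simpler constant-memory alternative: do not build the sequence at all, just loop with an accumulator. A minimal sketch (module and function names are only illustrative):

-module(euler_sum).
-export([sum_seq/2]).

%% Sum the integers M..N without ever building the list.
sum_seq(M, N) -> sum_seq(M, N, 0).

sum_seq(M, N, Acc) when M =< N -> sum_seq(M + 1, N, Acc + M);
sum_seq(_, _, Acc) -> Acc.

Called as euler_sum:sum_seq(1, 64000000), this runs in constant space because the loop is tail-recursive and carries only the accumulator.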