Today I was helping a friend of mine with some C code, and I found some strange behavior that I couldn't explain to him. We had a TSV file with a list of integers, one int per line. The first line was the number of lines the list had.
We also had a C file with a very simple "readfile" function. The first line was read into n, the number of lines, then there was an initialization of:
int list[n]
and finally a for loop of n iterations with an fscanf.
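Put together, the code looked roughly like this (reconstructed from memory; the file name list.tsv is just an example):

    #include <stdio.h>

    int main(void)
    {
        FILE *f = fopen("list.tsv", "r");
        if (f == NULL)
            return 1;

        int n;
        fscanf(f, "%d", &n);       /* first line: number of entries */

        int list[n];               /* the declaration in question */

        for (int i = 0; i < n; i++)
            fscanf(f, "%d", &list[i]);

        fclose(f);
        return 0;
    }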
For small n's (up to ~100,000), everything was fine. However, we found that when n was big (10^6), a segfault would occur.
Finally, we changed the list initialization to
int *list = malloc(n*sizeof(int))
and everything went well, even with very large n.
Can someone explain why this occurred? What was causing the segfault with int list[n], and why did it stop when we started using list = malloc(n*sizeof(int))?
int list[n] declares an array with automatic storage duration. Since n is only known at run time, it is a variable-length array, but it is still allocated on the stack.
int *list = malloc(n*sizeof(int)) is an example of dynamic allocation: the size only needs to be known at run time, and the array is allocated on the heap.
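A minimal sketch of the two forms side by side (the function and variable names are just for illustration; assume n > 0):

    #include <stdlib.h>

    void example(int n)
    {
        int on_stack[n];                        /* automatic storage: lives on the stack and is
                                                   released automatically when the function returns */
        on_stack[0] = 0;

        int *on_heap = malloc(n * sizeof(int)); /* dynamic storage: lives on the heap */
        if (on_heap == NULL)
            return;                             /* malloc reports failure by returning NULL */
        on_heap[0] = 0;
        free(on_heap);                          /* heap memory must be released manually */
    }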
int list[n] allocates space for n integers on the stack, which is usually pretty small. Using memory on the stack is much faster than the alternative, but it is quite small and it is easy to overflow the stack (i.e. allocate too much memory) if you do things like allocate huge arrays or recurse too deeply. You do not have to manually deallocate memory allocated this way; it is done automatically when the array goes out of scope.

malloc, on the other hand, allocates space on the heap, which is usually very large compared to the stack. You have to allocate a much larger amount of memory to exhaust the heap, but it is a lot slower to allocate memory on the heap than on the stack, and you must deallocate it manually via free when you are done using it.

int list[n] is a VLA, which allocates on the stack instead of on the heap. You don't have to free it (it is freed automatically at the end of the function call) and it allocates quickly, but the storage space is very limited, as you have discovered. You must allocate larger arrays on the heap.
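Concretely, the working version of your reader looks roughly like this (a sketch; the file name and the NULL check are my additions):

    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        FILE *f = fopen("list.tsv", "r");
        if (f == NULL)
            return 1;

        int n;
        fscanf(f, "%d", &n);

        int *list = malloc(n * sizeof(int));   /* heap allocation instead of a VLA */
        if (list == NULL) {                    /* malloc can fail; a VLA cannot report failure */
            fclose(f);
            return 1;
        }

        for (int i = 0; i < n; i++)
            fscanf(f, "%d", &list[i]);

        /* ... use list ... */

        free(list);                            /* heap memory must be freed manually */
        fclose(f);
        return 0;
    }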
There are several different pieces at play here.

The first is the difference between declaring an array as

int list[n]

and

int *list = malloc(n*sizeof(int))

In the first version, you are declaring an object with automatic storage duration. This means that the array lives only as long as the function that declares it is executing. In the second version, you are getting memory with dynamic storage duration, which means that it will exist until it is explicitly deallocated with free.

The reason that the second version works here is an implementation detail of how C is usually compiled. Typically, C memory is split into several regions, including the stack (for function calls and local variables) and the heap (for malloc'ed objects). The stack typically has a much smaller size than the heap; usually it's something like 8MB. As a result, if you try to allocate a huge array with

int list[n]

then you might exceed the stack's storage space, causing the segfault. With n = 10^6 and a 4-byte int, the array alone needs about 4MB, which is already on the order of a typical stack limit. On the other hand, the heap usually has a huge size (say, as much space as is free on the system), and so malloc'ing a large object won't cause an out-of-memory error.

In general, be careful with variable-length arrays in C. They can easily exceed stack size. Prefer malloc unless you know the size is small or you really only want the array for a short period of time.

Hope this helps!
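One way to act on that advice (not from this thread, just an illustrative pattern): keep a small fixed-size stack buffer and fall back to malloc only for large inputs. The function name and the 1024 cutoff are arbitrary.

    #include <stdlib.h>

    void process(int n)                          /* assume n > 0 */
    {
        int small_buf[1024];                     /* stack buffer, fine for small n */
        int *list = (n <= 1024) ? small_buf
                                : malloc(n * sizeof(int));   /* heap for large n */
        if (list == NULL)
            return;                              /* heap allocation failed */

        /* ... fill and use list[0..n-1] ... */

        if (list != small_buf)
            free(list);                          /* only the heap allocation needs freeing */
    }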
The declaration int list[n] allocates memory on the stack, while malloc allocates on the heap.

The stack size is usually much smaller than the heap, so if you allocate too much memory on the stack you get a stack overflow.

See also this answer for further information.
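As an aside, on POSIX systems you can see the actual stack limit with ulimit -s in the shell, or query it from C with getrlimit (a sketch; this is POSIX-specific, not standard C):

    #include <stdio.h>
    #include <sys/resource.h>

    int main(void)
    {
        struct rlimit rl;
        if (getrlimit(RLIMIT_STACK, &rl) == 0)          /* soft limit; often around 8 MB on Linux */
            printf("stack limit: %llu bytes\n", (unsigned long long)rl.rlim_cur);
        return 0;
    }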
int list[n] stores the data on the stack, while malloc stores it on the heap.

The stack is limited and does not have much space, while the heap is much, much bigger.