The following code gives me a segmentation fault when run on a 2 GB machine, but works on a 4 GB machine.
#include <iostream>
using namespace std;

int main()
{
    int c[1000000];
    cout << "done\n";
    return 0;
}
The size of the array is just 4 MB (1000000 four-byte ints). Is there a limit on the size of an array that can be used in C++?
Also, if you are running on most UNIX and Linux systems you can temporarily increase the stack size with the following command:
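Assuming a bash-style shell (the syntax differs in csh and friends):

ulimit -s unlimited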
But be careful, memory is a limited resource and with great power comes great responsibility :)
You're probably just getting a stack overflow here. The array is too big to fit in your program's stack address space.
If you allocate the array on the heap you should be fine, assuming your machine has enough memory.
int* array = new int[1000000];
But remember that this will require you to delete[] the array. A better solution would be to use std::vector<int> and resize it to 1000000 elements.
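A minimal sketch of the vector approach (variable name and size taken from the question):

#include <iostream>
#include <vector>

int main()
{
    std::vector<int> c(1000000); // elements live on the heap, zero-initialized
    std::cout << "done\n";
    return 0;
}

The vector releases its memory automatically when it goes out of scope, so no delete[] is needed.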
In C or C++ local objects are usually allocated on the stack. You are allocating a large array on the stack, more than the stack can handle, so you are getting a stack overflow.
Don't allocate it locally on the stack; use some other place instead. This can be achieved by either making the object global or allocating it on the heap. Global variables are fine if you don't use them from any other compilation unit. To make sure this doesn't happen by accident, add a static storage specifier; otherwise just use the heap.
This will allocate in the BSS segment, which is not part of the stack:
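A sketch of what this looks like, using the question's program:

#include <iostream>

static int c[1000000]; // uninitialized static storage: placed in BSS

int main()
{
    std::cout << "done\n";
    return 0;
}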
This will allocate in the DATA segment, which is also not part of the stack:
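A sketch with an explicit initializer (note: most toolchains keep all-zero arrays in BSS, so a nonzero value is used here to actually land in DATA):

#include <iostream>

static int c[1000000] = {1}; // nonzero-initialized static storage: placed in DATA

int main()
{
    std::cout << "done\n";
    return 0;
}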
This will allocate at some unspecified location in the heap:
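A sketch, with the matching delete[] added:

#include <iostream>

int main()
{
    int* c = new int[1000000]; // dynamic allocation: the array lives on the heap
    std::cout << "done\n";
    delete[] c; // heap memory must be released manually
    return 0;
}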
Your array is being allocated on the stack in this case. Try allocating an array of the same size using malloc() instead.
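A sketch of that suggestion, assuming malloc() is what the answer means:

#include <cstdlib>
#include <iostream>

int main()
{
    // request the 1000000 ints from the heap instead of the stack
    int* c = static_cast<int*>(std::malloc(1000000 * sizeof(int)));
    if (c == nullptr)
        return 1; // malloc can fail; check before use
    std::cout << "done\n";
    std::free(c);
    return 0;
}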
Because you store the array on the stack. You should store it on the heap instead; see the answers above for the difference between the heap and the stack.