Here's my code:

#include <stdio.h>

#define ROW 10000
#define COLUMN 10000

void hello(int arr[ROW][COLUMN]) {
    printf("hoho");
}

int main(void) {
    int arr[ROW][COLUMN];
    hello(arr);
    return 0;
}
Now, this gives me a segmentation fault. My question is: I understand that when a function call is made, the stack is used to hold the variables passed to the function. Is this the stack of the OS? That is, does the OS have a separate memory block designed specifically for this?
Also, is the size of the stack fixed?
What if I have to pass such large values to my functions?
The OS gives each of its tasks its own separate stack. It would be terrible if you could corrupt the OS's memory so easily.
Depending on your compiler and platform, you usually have about 1 MiB of stack memory. If you need to use such large amounts of memory, use malloc or calloc to allocate it from the heap.
"It depends". C doesn't even specify that there is anything called "a stack" being used for either local variables or calls.
In practice, yes, it's the process' "real" operating system-created stack that is being used, since that generally has hardware support (most processors have the instructions to implement a stack very efficiently).
Typically, large arrays are best allocated "on the heap", i.e. with malloc().
If you are on Linux,
ulimit -a
is the command to see the stack size along with the other limits, and
ulimit -s unlimited
will set the stack size to unlimited.
On my machine it was
stack size (kbytes, -s) 8192
and the program produced a segfault; after changing the stack size to unlimited it worked fine.
To change the limit from within the program logic itself, see setrlimit (man setrlimit).
P.S. - this is specific to the Linux environment.
Definitely not. Applications run in user mode, and each has its own memory space. On a modern OS, each application works in its own address space; one process cannot read or write another's memory (unless they use shared memory).
When you put too many values on the stack, you get a stack overflow.