I ran into a funny issue today working with large data structures. I was initially using a vector to store upwards of 1,000,000 ints, but later decided I didn't actually need the dynamic functionality of the vector (I was reserving 1,000,000 spots as soon as it was declared anyway) and it would be beneficial to instead be able to add values anywhere in the data structure. So I switched it to an array and BAM: stack overflow. I'm guessing this is because declaring the size of the array at compile time puts it on the stack, while using a dynamic vector placed it on the heap (which I'm guessing is larger?).
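In simplified form, the change was roughly this (the names and surrounding code here are illustrative, not my actual code):

```cpp
#include <vector>

// Before: the vector object itself is small; its 1,000,000 ints live on the heap.
std::vector<int> makeValues() {
    std::vector<int> values;
    values.reserve(1000000);   // capacity up front, but the storage is heap memory
    return values;
}

// After: ~4 MB of automatic (stack) storage in one function's frame,
// which is what triggers the stack overflow when the function is called.
void useArray() {
    int values[1000000];
    values[0] = 1;
    // ... fill and use values ...
}
```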
So what's the right answer here? Move back to a dynamic memory system just so it gets put on the heap? Increase the size of the stack? Or am I way off base on the whole thing here...?
Thanks!
I was initially using a vector to store upwards of 1,000,000 ints
Good idea.
but later decided I didn't actually need the dynamic functionality of the vector (I was reserving 1,000,000 spots as soon as it was declared anyway)
Not such a good idea. You did need it.
and it would be beneficial to instead be able to add values anywhere in the data structure.
I don't follow.
I'm guessing this is because declaring the size of the array at compile time puts it on the stack, while using a dynamic vector placed it on the heap (which I'm guessing is larger?).
Much. The call stack is typically only on the order of 1 MB to 8 MB by default, depending on the platform, while your "heap" (the free store) is really only bounded by your available RAM.
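Quick back-of-envelope check, assuming the usual 4-byte int:

```cpp
#include <cstdio>

int main() {
    // 1,000,000 ints comes to about 4 MB where sizeof(int) == 4,
    // which already exceeds a 1 MB default stack all by itself.
    std::printf("array size: %zu bytes\n", sizeof(int) * 1000000u);
}
```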
So what's the right answer here? Move back to a dynamic memory system just so it gets put on the heap?
Yes.
[edit: Joachim's right; static is another possible answer.]
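As a minimal sketch of both options (the function and variable names here are just for illustration):

```cpp
#include <vector>

// Option 1: dynamic storage. The vector object itself is tiny; its
// 1,000,000 elements are allocated on the heap/free store.
void dynamicVersion() {
    std::vector<int> values(1000000);   // all elements allocated (and zeroed) up front
    values[500000] = 42;                // indexed access works just like an array
}

// Option 2: static storage duration. The array is placed in the program's
// static data (BSS) segment, not on any thread's stack.
void staticVersion() {
    static int values[1000000];         // zero-initialized, not stack memory
    values[500000] = 42;
}
```

Either way, the 4 MB never touches the stack; the vector has the added benefit that the memory is released when it goes out of scope.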
Increase the size of the stack?
You could, but even if you could stretch 4 MB out of it, you'd leave yourself no wiggle room for other local variables. Best to use dynamic memory; that's the appropriate thing to do.
Or am I way off base on the whole thing here...?
No.