I recently learned that an integer takes 4 bytes of memory.
First I ran this code and measured the memory usage:
int main()
{
    int *pointer;
}
- It took 144 KB.
Then I modified the code to allocate 1000 integer variables.
int main()
{
    int *pointer;
    for (int n = 0; n < 1000; n++)
    {
        pointer = new int;
    }
}
- Then it took (168 − 144 =) 24 KB,
but 1000 integers are supposed to occupy only (4 bytes × 1000 ≈) 3.9 KB.
Then I decided to allocate 262,144 integers, which should consume 1 MB of memory.
int main()
{
    int *pointer;
    for (int n = 0; n < 262144; n++)
    {
        pointer = new int;
    }
}
Surprisingly, it now takes 8 MB.
Memory usage seems to grow far faster than the number of integers.
Why is this happening?
I'm on Kubuntu 13.04 (amd64).
Please give me a little explanation.
Thanks!
NOTE: sizeof(int) returns 4.
Memory for individually allocated dynamic objects is not required to be contiguous. In fact, due to the alignment requirements for new char[N] (namely, to be aligned at alignof(std::max_align_t), which is usually 16), the standard memory allocator might just never bother to return anything but 16-byte-aligned memory. So each int allocation actually consumes (at least) 16 bytes. (And further memory may be required by the allocator for internal bookkeeping.)

The moral is of course that you should be using std::vector<int>(1000000) to get a sensible handle on one million dynamic integers.
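For illustration, here is a minimal sketch of the vector approach (the 262,144 count is simply taken from the question; the exact overhead beyond the payload depends on your implementation):

#include <cstddef>
#include <iostream>
#include <vector>

int main()
{
    // One contiguous allocation holding all the integers.
    std::vector<int> numbers(262144);

    // The payload is exactly 262144 * 4 bytes = 1 MB, plus a small,
    // essentially constant overhead for the vector object and the allocator.
    std::cout << numbers.size() * sizeof(int) << " bytes of payload\n";
}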
I think it depends on how the compiler creates the output program.

The memory usage of a program includes all of its sections (such as .text, which contains the compiled machine code of the program), so it takes up some memory as soon as it is loaded.

As for the variables: dynamically allocated memory isn't really contiguous (because of memory alignment), so it can take more memory than you think it does.
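You can see the non-contiguity yourself with a small test (just a sketch; the exact addresses and the gaps between them depend entirely on your allocator):

#include <iostream>

int main()
{
    // Allocate a few ints individually and print their addresses.
    // The gaps between consecutive addresses are typically much larger
    // than sizeof(int); that difference is the per-allocation overhead.
    int *p[4];
    for (int n = 0; n < 4; n++)
    {
        p[n] = new int;
        std::cout << static_cast<void*>(p[n]) << '\n';
    }

    for (int n = 0; n < 4; n++)
        delete p[n];
}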
Two explanations:

- Every time you call new you perform a dynamic allocation, and each such allocation has its own overhead on top of the data you asked for.
- If you compiled with debugging information (the -g compiler flag), your memory usage may be larger than expected.
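As a rough way to probe the first point, you can ask the allocator how much space it actually set aside for a single allocation. Note the assumptions here: malloc_usable_size is a glibc extension (fine on Kubuntu), and this relies on operator new forwarding to malloc, which libstdc++ does; the reported size also excludes the allocator's own header, so treat it as a lower bound on the real cost:

#include <iostream>
#include <malloc.h>   // malloc_usable_size (glibc extension, Linux-only)

int main()
{
    int *p = new int;

    // You asked for sizeof(int) bytes...
    std::cout << "requested:   " << sizeof(int) << " bytes\n";

    // ...but the allocator reserved at least this much for the block.
    std::cout << "usable size: " << malloc_usable_size(p) << " bytes\n";

    delete p;
}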