I am trying to measure the memory consumption of a running program on Linux. I wrote a C++ program that allocates 1 GB of memory, then used /usr/bin/time to report its "Maximum resident set size":
/usr/bin/time -f '%Uu %Ss %er %MkB %x %C' ./takeMem 1000000000
0.85u 0.81s 1.68r **3910016kB** 0 ./takeMem 1000000000
From `man time`, I would interpret this as the program having a maximum resident set size of 3.9 GB, even though it allocated only 1 GB. That does NOT make sense.
Does anybody know what causes "Maximum resident set size" to be that high?
The C++ code is quite simple:
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char *argv[])
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <bytes>\n", argv[0]);
        return 1;
    }
    int memLength = atoi(argv[1]);
    fprintf(stderr, "Allocating %d bytes...", memLength);
    unsigned char *p = new unsigned char[memLength];
    fprintf(stderr, "Done\n");
    // Touch random bytes forever so the pages stay resident.
    while (true) {
        int i = rand() % memLength;
        char v = rand() % 256;
        p[i] = v;
    }
    return 0;
}