I run simulations on a 64-bit Windows computer with 64 GB RAM. Memory use reaches 55%, and after a finished simulation run I remove all objects in the workspace with `rm(list=ls())`, followed by a double `gc()`.
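Concretely, the cleanup step between runs is:

```r
rm(list = ls())   # remove all objects from the global environment
gc()              # first collection
gc()              # and a second pass, as described above
```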
I assumed this would free enough memory for the next simulation run, but memory usage actually drops by just 1%. Consulting many different forums, I could not find a satisfactory explanation, only vague comments such as:
"Depending on your operating system, the freed up memory might not be returned to the operating system, but kept in the process space."
I'd like to find information on:
- 1) on which operating systems, and under which conditions, freed memory is not returned to the OS, and
- 2) whether there is any remedy other than closing R and restarting it for the next simulation run.
The R garbage collector is imperfect in the following (not so) subtle way: it does not move objects (i.e., it does not compact memory) because of the way it interacts with C libraries. (Some other languages/implementations suffer from this too, but others, despite also having to interact with C, manage to have a compacting generational GC which does not suffer from this problem.)

This means that if you take turns allocating small chunks of memory which are then discarded and larger chunks for more permanent objects (a common situation when doing string/regexp processing), your memory becomes fragmented and the garbage collector can do nothing about it: the memory is released, but cannot be re-used because the free chunks are too short. A sketch of such an allocation pattern follows.
The only way to fix the problem is to save the objects you want, restart R, and reload the objects, as sketched below. Since you are doing `rm(list=ls())`, i.e., you do not need any objects, you do not need to save and reload anything; so, in your case, the solution is precisely what you want to avoid: restarting R.
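A sketch of the save/restart/reload workflow (the file name is illustrative):

```r
save.image("sim_state.RData")  # persist the objects you want to keep

## ...quit and restart R: a fresh process starts with an unfragmented heap...

load("sim_state.RData")        # restore the saved objects in the new session
```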
PS. Garbage collection is a highly non-trivial topic. For example, Ruby used 5 (!) different GC algorithms over 20 years. Java's GC does not suck because Sun/Oracle and IBM spent many programmer-years on their respective GC implementations. On the other hand, R and Python have lousy GCs, because no one bothered to invest the necessary man-years, and yet they are quite popular. That's worse-is-better for you.
PPS. Related: R: running out of memory using `strsplit`
How do you check memory usage? Normally, a virtual machine allocates a chunk of memory that it uses to store its data. Some of the allocated memory may be unused and is marked as free. What the GC does is discover data that is no longer referenced from anywhere and mark the corresponding chunks of memory as unused; this does not mean that the memory is released to the OS. Still, from the VM's perspective there is now more free memory available for further computation.
As others have asked: did you experience out-of-memory errors? If not, then there is nothing to worry about.
EDIT: This and this should be enough to understand how memory allocation and garbage collection work in R.
EDIT2:
To see used memory, try running `gc()` with `verbose` set to `TRUE`. With an array of 10,000,000 integers in memory, the Vcells line reports about 40.6 Mb used; after discarding the reference to the array and collecting again, Vcells usage falls to 2.4 Mb. A sketch of such a session follows.
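A minimal sketch of that session (exact cell counts and gc-trigger values vary by machine and R version; only the rough Vcells figures matter):

```r
x <- integer(1e7)    # 10,000,000 integers, roughly 40 MB of vector storage
gc(verbose = TRUE)   # the Vcells "used (Mb)" column shows ~40 Mb

rm(x)                # discard the only reference to the array
gc(verbose = TRUE)   # Vcells usage drops back to a couple of Mb
```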