I'm on Windows Server 2012 (64-bit) with 30.5 GB of RAM, running R v3.1.2 in RStudio 0.98, and I'm still hitting what looks like a memory limit in R.
I reviewed the FAQ here: http://cran.r-project.org/bin/windows/base/rw-FAQ.html#There-seems-to-be-a-limit-on-the-memory-it-uses_0021
It states that the memory limit on 64-bit builds defaults to the total amount of RAM, and that the limit can be checked and set with memory.limit().
A call to memory.limit() returns 31249 (MB), confirming that R can see the full ~30.5 GB.
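For reference, the FAQ's check-and-set looks like this:

memory.limit()              # returns 31249 (MB) on this machine
memory.limit(size = 31249)  # the limit can also be set explicitly, per the FAQ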
However, when I make a modeling call on a large dataset (~10M rows):
library(partykit)  # ctree() and ctree_control(); the control= argument matches partykit's signature
ctree(as.formula(formula), data = d, control = ctree_control(mincriterion = 0.9, minbucket = 1000))
I get the following error:
'Calloc' could not allocate memory (18446744073673801728 of 8 bytes)
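Notably, the requested size is suspicious: it's just under 2^64 (18446744073709551616), which looks like a negative allocation size being reinterpreted as an unsigned 64-bit integer rather than a genuine request for ~16 exabytes. A quick sanity check in R (both literals are exactly representable as doubles):

sprintf("%.0f", 2^64)        # "18446744073709551616"
2^64 - 18446744073673801728  # 35749888, i.e. only ~36 MB short of 2^64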
Meanwhile, the Windows Task Manager shows over 25 GB still available, with the R process using only about 2.3 GB.
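For completeness, R's own Windows-only accounting can be queried from within the session; all of these report MB:

memory.size()            # memory currently in use by this R session
memory.size(max = TRUE)  # peak memory obtained from the OS so far
memory.limit()           # the allocation cap (31249 here)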
Running the model directly in R, outside of RStudio, yields the same error, so RStudio isn't the variable.
I'm perplexed: why does R refuse to use all of my available memory?