I am running into issues trying to use large objects in R. For example:
> memory.limit(4000)
> a = matrix(NA, 1500000, 60)
> a = matrix(NA, 2500000, 60)
> a = matrix(NA, 3500000, 60)
Error: cannot allocate vector of size 801.1 Mb
> a = matrix(NA, 2500000, 60)
Error: cannot allocate vector of size 572.2 Mb # Can't go smaller anymore
> rm(list=ls(all=TRUE))
> a = matrix(NA, 3500000, 60) # Now it works
> b = matrix(NA, 3500000, 60)
Error: cannot allocate vector of size 801.1 Mb # But that is all there is room for
I understand that this is related to the difficulty of obtaining contiguous blocks of memory (from here):
Error messages beginning cannot allocate vector of size indicate a failure to obtain memory, either because the size exceeded the address-space limit for a process or, more likely, because the system was unable to provide the memory. Note that on a 32-bit build there may well be enough free memory available, but not a large enough contiguous block of address space into which to map it.
How can I get around this? My main difficulty is that I get to a certain point in my script and R can't allocate 200-300 Mb for an object... I can't really pre-allocate the block because I need the memory for other processing. This happens even when I diligently remove unneeded objects.
EDIT: Yes, sorry: Windows XP SP3, 4 GB RAM, R 2.12.0:
> sessionInfo()
R version 2.12.0 (2010-10-15)
Platform: i386-pc-mingw32/i386 (32-bit)
locale:
[1] LC_COLLATE=English_Caribbean.1252 LC_CTYPE=English_Caribbean.1252
[3] LC_MONETARY=English_Caribbean.1252 LC_NUMERIC=C
[5] LC_TIME=English_Caribbean.1252
attached base packages:
[1] stats graphics grDevices utils datasets methods base
1. Consider whether you really need all this data explicitly, or can the matrix be sparse? There is good support in R for sparse matrices (see the Matrix package, for example); a sketch follows this list.
2. Keep all other processes and objects in R to a minimum when you need to make objects of this size. Use gc() to clear now-unused memory or, better, only create the object you need in one session.
3. If the above cannot help, get a 64-bit machine with as much RAM as you can afford, and install 64-bit R.
4. If you cannot do that, there are many online services for remote computing.
5. If you cannot do that, memory-mapping tools like the ff package (or bigmemory, as Sascha mentions) will help you build a new solution; see the second sketch after this list. In my limited experience, ff is the more advanced package, but you should read the High Performance Computing topic on CRAN Task Views.
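A minimal sketch of point 1, assuming most entries of your matrix can be zero (the dimensions are the ones from the question; Matrix() is the constructor from the Matrix package):

> library(Matrix)
> a <- Matrix(0, 3500000, 60, sparse = TRUE) # only non-zero entries are stored
> object.size(a) # tiny, versus ~800 Mb for the dense version
> a[1, 1] <- 1 # assign values as they become known

And a sketch of point 5 with ff, whose ff() constructor keeps the data in a file on disk and pages chunks into RAM as needed:

> library(ff)
> b <- ff(vmode = "double", dim = c(3500000, 60)) # backed by a file, not RAM
> b[1, 1] <- 1 # reads and writes go through a small in-memory window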
The save/load method mentioned above works for me. I am not sure how (or if) gc() defrags the memory, but this seems to work.

The simplest way to sidestep this limitation is to switch to 64-bit R.
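A sketch of that save/load workflow (the file name here is hypothetical):

> save(a, file = "a.RData") # write the object to disk
> rm(a); gc() # drop it from the current session
> load("a.RData") # restore it later, ideally in a fresh session

And, not from the answers themselves, a quick way to confirm which build you are running before and after the switch (both values are base R):

> .Machine$sizeof.pointer # 4 on a 32-bit build, 8 on a 64-bit build
> R.version$arch # "i386" in the sessionInfo above; "x86_64" for 64-bit R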
I encountered a similar problem, and I used two flash drives as ReadyBoost. The two drives gave an additional 8 GB boost of memory (for cache), which solved the problem and also increased the speed of the system as a whole. To use ReadyBoost, right-click on the drive, go to Properties, select the 'ReadyBoost' tab, select the 'use this device' radio button, and click Apply or OK to configure.
For Windows users, the following helped me a lot to understand some memory limitations: use gc() to do garbage collection. It works; I can see the memory use go down to 2 GB.
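A minimal sketch of that check (memory.size(), like memory.limit(), is Windows-only; the 2 GB figure above is from the answerer's machine):

> memory.size() # Mb currently used by this R process
> gc() # force a collection; prints a summary of memory used
> memory.size() # should now report a lower figure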
Additional advice that works on my machine: I followed the help page of memory.limit and found out that on my computer R can use up to ~1.5 GB of RAM by default, and that the user can increase this limit. Using the following code helped me to solve my problem.
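The exact size from the original post is not preserved here; a minimal sketch, assuming you raise the limit toward the machine's installed RAM (size is in Mb, memory.limit() is Windows-only, and a 32-bit process cannot use much more than 2-3 GB whatever you request):

> memory.limit() # report the current limit in Mb
> memory.limit(size = 3000) # hypothetical value; raise the limit to ~3 GB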