I'm running 64-bit R on Ubuntu 12.10 AMD64. I recently added an additional 8 GB of memory to my system, for a total of 12 GB. But I notice that R gives me an error whenever the memory usage of a single R session goes above 4 GB. When I ran 6 R sessions in parallel, each consuming ~3 GB of memory, my overall memory usage went up to 11 GB, yet a single R session cannot use more than 4 GB. I need to train a random forest model over a large data set, and that requires more than 4 GB in a single R session.
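Something along these lines reproduces the behaviour I'm describing (the sizes here are illustrative, not my actual data):

    # Illustrative only: allocate a numeric vector of ~5.6 GB
    # (7e8 doubles * 8 bytes each) to push a single session past 4 GB
    x <- numeric(7e8)
    print(object.size(x), units = "Gb")

    # report current memory usage of this R session
    gc()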
Update:
R> sessionInfo()
R version 2.15.1 (2012-06-22)
Platform: x86_64-pc-linux-gnu (64-bit)
locale:
[1] LC_CTYPE=en_US.UTF-8 LC_NUMERIC=C
[3] LC_TIME=en_US.UTF-8 LC_COLLATE=en_US.UTF-8
[5] LC_MONETARY=en_US.UTF-8 LC_MESSAGES=en_US.UTF-8
[7] LC_PAPER=C LC_NAME=C
[9] LC_ADDRESS=C LC_TELEPHONE=C
[11] LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C
attached base packages:
[1] stats graphics grDevices utils datasets methods
[7] base
loaded via a namespace (and not attached):
[1] tools_2.15.1
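For completeness, the pointer size also confirms a 64-bit build (it is 8 bytes on 64-bit R):

    R> .Machine$sizeof.pointer
    [1] 8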
Update 2:
$ ulimit -a
core file size (blocks, -c) 0
data seg size (kbytes, -d) unlimited
scheduling priority (-e) 0
file size (blocks, -f) unlimited
pending signals (-i) 92787
max locked memory (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files (-n) 1024
pipe size (512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority (-r) 0
stack size (kbytes, -s) 8192
cpu time (seconds, -t) unlimited
max user processes (-u) 92787
virtual memory (kbytes, -v) unlimited
file locks (-x) unlimited
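The ulimit output above is from my shell; to rule out the R process inheriting different limits, the same information can be read from within the running session (Linux-specific, assumes /proc is available):

    # print the limits that apply to this R process
    R> writeLines(readLines("/proc/self/limits"))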