Efficient memory management in R

Posted 2020-06-03 04:28

I have 6 GB memory in my machine (Windows 7 Pro 64 bit) and in R, I get

> memory.limit()
6141

Of course, when dealing with big data, memory allocation errors occur. So, to make R use virtual memory, I use

> memory.limit(50000)

Now, when running my script, I no longer get memory allocation errors, but R hogs all the memory in my computer, so I can't use the machine until the script finishes. I wonder if there is a better way to make R manage the machine's memory. I think one thing it could do is switch to virtual memory once it is using more physical memory than the user specifies. Is there any option like that?

3 Answers
狗以群分 · 2020-06-03 05:12

Look at the ff and bigmemory packages. These use functions that know about R objects to keep them on disk, rather than leaving paging to the OS (which knows only about chunks of memory, not what they represent).
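For example, here is a minimal sketch with ff (assuming the package is installed; the length of 1e8 is just an illustrative size):

library(ff)
# a disk-backed numeric vector: the data live in a file, not in RAM
x <- ff(vmode = "double", length = 1e8)
x[1:5] <- rnorm(5)   # only the accessed chunk is brought into memory
x[1:5]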

Root(大扎) · 2020-06-03 05:15

This is not a solution but a suggestion. Use memory-efficient objects wherever possible: for instance, use a matrix instead of a data.frame.

Here is an example:

> m <- matrix(rnorm(4), 2, 2)
> d <- as.data.frame(m)
> object.size(m)
232 bytes
> object.size(d)
808 bytes
我命由我不由天 · 2020-06-03 05:17

R doesn't manage the memory of the machine. That is the responsibility of the operating system. The only reason memory.size and memory.limit exist on Windows is that (from help("Memory-limits")):

 Under Windows, R imposes limits on the total memory allocation
 available to a single session as the OS provides no way to do so:
 see 'memory.size' and 'memory.limit'.

R objects also have to occupy contiguous space in RAM, so you can run into memory allocation issues with only a few large objects. You could probably be more careful with the number/size of objects you create and avoid using so much memory.
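As a rough sketch of that housekeeping (the object name big_intermediate is hypothetical), you can audit what is occupying memory and free what you no longer need:

# list the objects in the workspace by size, largest first
sort(sapply(ls(), function(x) object.size(get(x))), decreasing = TRUE)

# drop intermediates you no longer need and ask R to release memory
rm(big_intermediate)   # hypothetical object name
gc()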
