I have a bunch of R functions which I need to call through Python. However, I run into memory errors when I try to allocate a large matrix. The same functions run fine in RStudio on the same computer. Here is a code chunk which crashes:
#python:
import rpy2.robjects as ro
import gc
gc.collect()
ro.r.source("calibration_functions.R")
result1 = ro.r.func1() #Does some calculations, works fine.
result2 = ro.r.func2(result1) #Crashes at this step
#R code:
func2 <- function(result1){
  preds_mat = matrix(data = NA, nrow = 263310, ncol = 1000)
  # do something...
  return(preds_mat)
}
The error I get is: RRuntimeError: Error: cannot allocate vector of size 1004.4 Mb
How can I clean the R memory? Neither gc() nor gc.collect() works.
I typically deal with the issue by allocating more memory
(...)
It may be the same function, but there are likely differences in memory usage from other applications between the two runs.
Your R function func2() returns a 263310 x 1000 matrix. Since matrix(data = NA, ...) fills it with logical NA values (4 bytes per element), the object is 263310 * 1000 * 4 bytes, about 1.05 GB, which is the "1004.4 Mb" reported in the error.
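If you want to verify that size yourself, here is a minimal sketch from the rpy2 side (the arithmetic assumes the matrix stays logical because it is only filled with NA):

#python:
import rpy2.robjects as ro

# A logical matrix uses 4 bytes per element, so the allocation in func2() is:
print(263310 * 1000 * 4 / 1024**2)   # ~1004.4 Mb, the size quoted in the error

# Or ask R directly; note this allocates the full matrix, so ~1 GB must be free:
ro.r('print(object.size(matrix(NA, nrow = 263310, ncol = 1000)), units = "Mb")')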
The error observed is likely occurring either because of what is happening in the unspecified function func1(), or because something has changed between the execution in RStudio and the execution in rpy2. To clean the R memory:
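Here is a minimal sketch of the idea, assuming result1 is no longer needed (rpy2 keeps an R object alive as long as the corresponding Python object exists, so drop the Python references first):

#python:
import gc
import rpy2.robjects as ro

# 1. Drop the Python-side references so rpy2 can release the underlying R objects.
del result1
gc.collect()

# 2. Remove leftover objects from R's global environment and run R's garbage
#    collector. rm(list = ls()) clears everything; list specific names instead
#    if some objects must be kept.
ro.r('rm(list = ls()); gc()')

Note that gc.collect() on its own does not free R's memory while Python variables such as result1 still reference the R objects; calling R's gc() through ro.r('gc()') is the equivalent of running gc() in RStudio.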