I have 17 file-backed big.matrix objects (dim 10985 x 52598, 4.3 GB each) for which I would like to calculate the element-wise mean. The result can be stored in another big.matrix (gcm.res.outputM).
biganalytics::apply() doesn't work here, as MARGIN can only be set to 1 or 2. I tried using two nested for loops, as shown here:
gcm.res.outputM <- filebacked.big.matrix(
  10985, 52598, separated = FALSE,
  backingfile = "gcm.res.outputM.bin", backingpath = NULL,
  descriptorfile = "gcm.res.outputM.desc", binarydescriptor = FALSE
)
for (i in 1:10985) {
  for (j in 1:52598) {
    t <- rbind(gcm.res.output1[i, j],  gcm.res.output2[i, j],  gcm.res.output3[i, j],
               gcm.res.output4[i, j],  gcm.res.output5[i, j],  gcm.res.output6[i, j],
               gcm.res.output7[i, j],  gcm.res.output8[i, j],  gcm.res.output9[i, j],
               gcm.res.output10[i, j], gcm.res.output11[i, j], gcm.res.output12[i, j],
               gcm.res.output13[i, j], gcm.res.output14[i, j], gcm.res.output15[i, j],
               gcm.res.output16[i, j], gcm.res.output17[i, j])
    tM <- apply(t, 2, mean, na.rm = TRUE)
    gcm.res.outputM[i, j] <- tM
  }
}
which takes around 1.5 minutes per row i, and would therefore need about 11 days to run.
Does anyone have any ideas on how to speed up this calculation? I'm using a 64-bit Windows 10 machine with 16 GB of RAM.
Thanks!
You can use this Rcpp code:
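Something along these lines should work (a minimal sketch: add_to() is a helper name chosen here, and it assumes all matrices are double-typed, have identical dimensions, and contain no missing values — supporting na.rm = TRUE would additionally need a count accumulator):

// [[Rcpp::depends(BH, bigmemory)]]
#include <bigmemory/MatrixAccessor.hpp>
#include <Rcpp.h>

using namespace Rcpp;

// Add every element of `source` into `dest` (a running sum).
// Looping over columns in the outer loop matches the
// column-major layout of a big.matrix.
// [[Rcpp::export]]
void add_to(XPtr<BigMatrix> source, XPtr<BigMatrix> dest) {

  MatrixAccessor<double> src(*source);
  MatrixAccessor<double> acc(*dest);

  size_t n = source->nrow();
  size_t m = source->ncol();

  for (size_t j = 0; j < m; j++)
    for (size_t i = 0; i < n; i++)
      acc[j][i] += src[j][i];
}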
Then if you have a list of big.matrix objects of the same size, you can do:
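A sketch of the R side — the source file name and the way the list is built are illustrative, and it assumes gcm.res.outputM starts out filled with zeros:

Rcpp::sourceCpp("add_to.cpp")  # file containing the C++ code above

# Collect the 17 input big.matrix objects in a list
list_bm <- lapply(1:17, function(k) get(paste0("gcm.res.output", k)))

# Accumulate the element-wise sums into gcm.res.outputM
for (bm in list_bm) add_to(bm@address, gcm.res.outputM@address)

# Turn sums into means, one column at a time to keep memory use low
for (j in 1:ncol(gcm.res.outputM))
  gcm.res.outputM[, j] <- gcm.res.outputM[, j] / length(list_bm)

Each matrix is then read once, sequentially, instead of being hit with millions of single-element accesses, which is what removes the per-element overhead of the double loop.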