I am looking for a method to merge two large data frames (> 1 million rows; ~300 KB as an RData file) that is efficient both in machine resources and in learning/implementation effort.
`merge` in base R and `join` in plyr appear to use up all my memory, effectively crashing my system.
Example: load the test data frame and try

```r
test.merged <- merge(test, test)
```

or

```r
test.merged <- join(test, test, type = "all")
```
The following post provides a list of merge methods and alternatives:
How to join (merge) data frames (inner, outer, left, right)?
The following allows object size inspection:
https://heuristically.wordpress.com/2010/01/04/r-memory-usage-statistics-variable/
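As a quick illustration of the kind of inspection that post describes, base R's `object.size()` reports an object's memory footprint (the data frame used here is just the built-in `mtcars` demo set):

```r
# Report the memory footprint of an R object using base R
sz <- object.size(mtcars)   # mtcars: built-in demo data frame
print(sz, units = "Kb")     # human-readable size in kilobytes
```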
Data produced by anonym
Do you have to do the merge in R? If not, merge the underlying data files using a simple file concatenation and then load them into R. (I realize this may not apply to your situation -- but if it does, it could save you a lot of headache.)
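A minimal sketch of that approach, assuming both inputs are CSV files with an identical header row (the file names and columns below are illustrative toy stand-ins, not the asker's data):

```r
# Toy stand-ins for the two large files on disk
f1 <- tempfile(fileext = ".csv")
f2 <- tempfile(fileext = ".csv")
writeLines(c("id,x", "1,10", "2,20"), f1)
writeLines(c("id,x", "3,30", "4,40"), f2)

# Concatenate outside of merge(): append the body of the second file
# (header dropped) to the first, then read the combined result once.
out <- tempfile(fileext = ".csv")
writeLines(c(readLines(f1), readLines(f2)[-1]), out)
combined <- read.csv(out)
nrow(combined)  # 4: the two files stacked
```

Note this stacks rows (like `rbind`) rather than joining on a key, which is why it only helps if you don't actually need a keyed merge.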
Here's the obligatory `data.table` example, along with some timings for the data.table vs. data.frame methods.
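A sketch of what such a comparison might look like (the original benchmark code isn't reproduced here; the column names and the one-million-row size are assumptions based on the question):

```r
library(data.table)  # assumes the data.table package is installed

set.seed(1)
n <- 1e6  # roughly the size mentioned in the question

# Two keyed data.tables sharing an "id" column (names are illustrative)
dt1 <- data.table(id = 1:n, x = rnorm(n), key = "id")
dt2 <- data.table(id = 1:n, y = rnorm(n), key = "id")

# Keyed data.table join: dt1[dt2] joins the tables on the shared key
system.time(res_dt <- dt1[dt2])

# Equivalent base-R merge on plain data frames, for comparison
df1 <- as.data.frame(dt1)
df2 <- as.data.frame(dt2)
system.time(res_df <- merge(df1, df2, by = "id"))
```

Setting a key sorts the table by that column, so the join is a binary search rather than the hashing/sorting work `merge` repeats on every call.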
Using data.table is much faster. Regarding memory, I can informally report that the two methods are very similar (within 20%) in RAM use.