clusterExport to single thread in R parallel


Question:

I would like to split a large data.frame into chunks and pass each chunk to a different member of the cluster.

Something like:

library(parallel)
cl <- makeCluster(detectCores())
for (i in 1:detectCores()) {
  clusterExport(cl, mydata[indices[[i]], ], <extra option to specify a thread/process>)
}

Is this possible?

Answer 1:

Here is an example that uses clusterCall inside a for loop to send a different chunk of the data frame to each of the workers:

library(parallel)
cl <- makeCluster(detectCores())
df <- data.frame(a=1:10, b=1:10)
# Compute one set of row indices per worker
ix <- splitIndices(nrow(df), length(cl))
for (i in seq_along(cl)) {
  # cl[i] restricts the call to the i-th worker only
  clusterCall(cl[i], function(d) {
    # store the chunk as 'mydata' in the worker's global environment
    assign('mydata', d, pos=.GlobalEnv)
    NULL  # don't return any data to the master
  }, df[ix[[i]], , drop=FALSE])
}

Note that the call to clusterCall subsets cl so that the function executes on a single worker each time through the for loop.
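Subsetting works because cl[i] is itself a valid (one-worker) cluster object, so any of the usual cluster functions can be pointed at a single worker. A minimal sketch illustrating this:

# runs only on worker 2; returns a one-element list with its PID
clusterCall(cl[2], function() Sys.getpid())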

You can verify that the workers were properly initialized in this example using:

r <- do.call('rbind', clusterEvalQ(cl, mydata))
identical(df, r)
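If every worker received its chunk, identical(df, r) returns TRUE: splitIndices preserves row order, so rbind reassembles the original data frame.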

There are easier ways to do this, but this example minimizes the memory used by the master and the amount of data sent to each worker, which matters when the data frame is very large.
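For contrast, one of those easier ways could look like the following sketch using clusterApply, which distributes one element of a list to each worker. The trade-off is that it materializes all of the chunks on the master at once before sending them:

library(parallel)
cl <- makeCluster(detectCores())
df <- data.frame(a=1:10, b=1:10)

# materialize all chunks on the master at once (the memory cost
# the loop-based approach above avoids)
chunks <- lapply(splitIndices(nrow(df), length(cl)),
                 function(i) df[i, , drop=FALSE])

# with one chunk per worker, clusterApply sends chunks[[i]] to worker i
clusterApply(cl, chunks, function(d) {
  assign('mydata', d, pos=.GlobalEnv)
  NULL  # don't return any data to the master
})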