I'm trying to parallelize (using snow::parLapply) some code that depends on a package (i.e., a package other than snow). Objects referenced in the function called by parLapply must be explicitly passed to the cluster using clusterExport. Is there any way to make an entire package available on the cluster, rather than having to explicitly name every function (including a package's internal functions called by user functions!) in clusterExport?
Install the package on all nodes, and have your code call library(thePackageYouUse) on all nodes via one of the available commands, e.g. clusterEvalQ. I think the parallel package, which comes with recent R releases, has examples -- see for example help(clusterApply), where the boot package is loaded everywhere:
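A minimal sketch of that pattern, assuming the parallel package; boot is just a stand-in for whichever package your code depends on, and the worker count is arbitrary:

```r
library(parallel)

# Start a small local cluster (2 workers is an arbitrary choice).
cl <- makeCluster(2)

# Run library(boot) on every worker; this attaches the whole package,
# internals included, so nothing needs to go through clusterExport.
clusterEvalQ(cl, library(boot))

# The package's functions are now visible inside parLapply calls.
res <- parLapply(cl, 1:2, function(i) exists("boot.ci"))

stopCluster(cl)
```

The key point is that clusterEvalQ evaluates an expression on each node, so a single library() call replaces exporting functions one by one.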