Run multiple R-scripts simultaneously

Posted 2019-02-02 02:41

Question:

In my thesis I need to perform a lot of simulation studies, which all take quite a while. My computer has 4 cores, so I have been wondering if it is possible to run, for example, two R scripts in RStudio at the same time by letting them use two different cores. If this could be done, I could save a lot of time by just leaving the computer running all these scripts overnight.

Answer 1:

Assuming that the results do not need to end up in the same environment, you can achieve this using RStudio projects: https://support.rstudio.com/hc/en-us/articles/200526207-Using-Projects

First create two separate projects. You can open both simultaneously, which will result in two rsessions. You can then open each script in its own project and execute each one separately. It is then up to your OS to manage the core allocation.



Answer 2:

In RStudio

If you right-click on RStudio, you should be able to open several separate "sessions" of RStudio (whether or not you use Projects). By default, these will each use one core.

Update (July 2018): RStudio v1.2.830-1, which is available as a Preview Release, supports a "Jobs" pane. This is dedicated to running R scripts in the background, separate from the interactive R session:

  • Run any R script as a background job in a clean R session

  • Monitor progress and see script output in real time

  • Optionally give jobs your global environment when started, and export values back when complete

This will be available in RStudio version 1.2.
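If you prefer to start such a background job from code rather than from the IDE, the rstudioapi package exposes a helper for this. The snippet below is a hedged sketch, assuming RStudio 1.2+ with a recent rstudioapi installed; "my_simulation.R" is a hypothetical script path, not something from the original answer.

library(rstudioapi)

# Run a script as a background job in a clean R session (assumes rstudioapi >= 0.10)
jobRunScript(
  path = "my_simulation.R",   # hypothetical script to run in the background
  importEnv = FALSE,          # start from a clean session rather than copying the global environment
  exportEnv = "R_GlobalEnv"   # copy the job's results back into the global environment when it finishes
)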

Running Scripts in the Terminal

If you have several scripts that you know run without errors, I'd recommend running these on different parameters through the command line:

R CMD BATCH script.R
Rscript script.R
R --vanilla < script.R

Running in the background:

nohup Rscript script.R &

Here "&" runs the script in the background (it can be retrieved with fg, monitored with htop, and killed with kill <pid> or pkill rsession) and nohup saves the output in a file and continues to run if the terminal is closed.

Passing arguments to a script:

Rscript script.R 1 2 3

This will pass "1", "2", and "3" to R as the trailing entries of commandArgs() (note that they arrive as character strings), so a bash loop can run multiple instances of Rscript:

for ii in 1 2 3
do
  nohup Rscript script.R $ii &
done
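For completeness, here is a minimal sketch of what script.R might contain in order to pick up that argument; the object names and the toy simulation are illustrative, not part of the original answer.

args <- commandArgs(trailingOnly = TRUE)   # e.g. "1" when called as: Rscript script.R 1
run_id <- as.numeric(args[1])              # arguments arrive as character strings

set.seed(run_id)                           # use the argument to vary the simulation
result <- mean(rnorm(1e6))
saveRDS(result, file = paste0("result_", run_id, ".rds"))  # one output file per run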

Running parallel code within R

You will often find that a particular step in your R script is what slows the computation. If so, may I suggest running parallel code within your R script rather than running the scripts separately? I'd recommend the snow package for running loops in parallel in R. The general pattern is:

library(snow)
cl <- makeCluster(n)            # n = number of cores (I'd recommend one less than machine capacity)
clusterExport(cl, list = ls())  # export input data to all cores
output_list <- parLapply(cl, input_list, function(x) ... )
stopCluster(cl)                 # close cluster when complete (particularly on shared machines)

Use this anywhere you would normally use lapply in R to run it in parallel.
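To make that concrete, here is a small self-contained sketch using snow; input_list and the squaring function are toy stand-ins for your own data and computation, and n is chosen arbitrarily.

library(snow)

n <- 2                                     # number of cores (one less than machine capacity is a safe default)
input_list <- as.list(1:10)                # toy input data

cl <- makeCluster(n)
clusterExport(cl, list = ls())             # export objects in the global environment to all workers
output_list <- parLapply(cl, input_list, function(x) x^2)  # each element is processed on a worker
stopCluster(cl)                            # close the cluster when complete

str(output_list)                           # list of squared values, in the same order as input_list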



Answer 3:

You can achieve multicore parallelism within the same session (as explained here: https://cran.r-project.org/web/packages/doMC/vignettes/gettingstartedMC.pdf) with the following code:

numberOfCores <- parallel::detectCores() - 1  # leave one core free for the system

if (Sys.info()["sysname"] == "Windows") {
  library(doParallel)
  cl <- makeCluster(numberOfCores)
  registerDoParallel(cl)
} else {
  library(doMC)
  registerDoMC(numberOfCores)
}
library(foreach)

someList <- list("file1", "file2")
returnComputation <-
  foreach(x = someList) %dopar% {
    source(x)
  }

if (Sys.info()["sysname"] == "Windows") stopCluster(cl)

You will still need to adapt your output.
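As a hedged illustration of adapting the output: with the backend registered as above, foreach can also combine what each iteration returns, for example by row-binding data frames. simulate_once below is a hypothetical function standing in for your own per-iteration code.

library(foreach)

results <- foreach(i = 1:4, .combine = rbind) %dopar% {
  simulate_once(i)   # hypothetical: should return a one-row data frame per iteration
}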



Answer 4:

If your task is embarrassingly parallel, you can open as many terminals as you want in the Terminal tab (located just after the Console tab) and run your code with Rscript yourcode.R. Each script will run on a separate core by default. You can also use command-line arguments (as @Tom Kelly mentioned) if needed.



Tags: r rstudio