Since this question was asked a couple of months ago, I hope you have already found an answer. However, in case you are still interested in feedback, here are some things to keep in mind:
When using foreach with a parallel backend, you will not be able to assign to variables in R's global environment in the way you are attempting (as you probably noticed). With a sequential backend the assignment would work, but then you would not get the parallel execution that doSNOW provides.
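To illustrate the first point, here is a minimal sketch (the variable x and the loop body are hypothetical, not taken from your code): each worker receives its own copy of x, so the superassignment happens on the worker and never reaches the master's global environment.

    library(doSNOW)   # also attaches foreach
    cl <- makeCluster(2)
    registerDoSNOW(cl)
    x <- 0
    res <- foreach(i = 1:4) %dopar% {
        x <<- x + i   # modifies a copy on the worker, not the master's x
        NULL
    }
    x   # still 0 on the master
    stopCluster(cl)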
Instead, have each iteration return all of its results in a list, and capture the object that foreach returns, so that you can extract the corresponding results after all computations have completed.
My suggestion starts similarly to your example:
    library(doSNOW)
    MaxSearchSpace <- 44*5
    cl <- makeCluster(parallel::detectCores())
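The loop itself is not shown in your excerpt; here is a sketch of what it might look like (the body and the names resA and resB are placeholders for your own computations), producing the object theRes used below:

    registerDoSNOW(cl)
    theRes <- foreach(i = seq_len(MaxSearchSpace)) %dopar% {
        resA <- i^2       # placeholder for your first computation
        resB <- sqrt(i)   # placeholder for your second computation
        list(resA, resB)  # return everything you need as a list
    }
    stopCluster(cl)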
After all iterations have completed, extract the results from theRes and save them as separate objects (e.g. globalVariable1, globalVariable2, etc.):
    # collect the first element of each iteration's list into one object
    globalVariable1 <- do.call(cbind, lapply(theRes, "[[", 1))
    # and likewise for the second element
    globalVariable2 <- do.call(cbind, lapply(theRes, "[[", 2))
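With the placeholder loop sketched above, each extracted element is a scalar, so do.call(cbind, ...) produces a 1-by-MaxSearchSpace matrix; if each iteration returns vectors instead, you get one column per iteration.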
Keep in mind that if the computations in each iteration depend on results from previous iterations, this type of parallelization is not the right tool for the job.