I spent some time on Google looking for a queue / load-balancing package for R.
What I'm trying to achieve:
- running several functions, independent of each other, in R from a remote interface.
- using two dual-core servers as the R backend.
Knowing that:
- each function usually takes 10-30 seconds to process.
- every 5 minutes on average, a set of 8-15 functions to be executed is sent to the backend (processed as a queue: first in, first out); several sets can arrive at the same time.
- the 2x2 R instances will already be running with their packages loaded; the required packages are always the same, so there is no need to reload them every time.
- the amount of data transferred is very small: 50 KB max.
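To make the intended setup concrete, here is a minimal sketch of what I have in mind, written against the Rserve / RSclient packages (one persistent Rserve instance per core, calls pushed to them over TCP). The host names, ports, and the naive round-robin dispatch are my own illustration, not a working system — a real solution would need an actual FIFO queue that tracks which workers are busy:

```r
# Sketch only (assumes Rserve is running on both servers, one instance
# per core; host names and ports below are placeholders).
library(RSclient)

# Four persistent workers: 2 servers x 2 cores, packages already loaded.
workers <- list(
  RS.connect("server1", 6311),
  RS.connect("server1", 6312),
  RS.connect("server2", 6311),
  RS.connect("server2", 6312)
)

# Naive round-robin dispatch of one incoming set of quoted calls.
# A proper queue would instead hand each call to the next idle worker.
dispatch <- function(calls) {
  results <- vector("list", length(calls))
  for (i in seq_along(calls)) {
    conn <- workers[[(i - 1) %% length(workers) + 1]]
    # lazy = FALSE sends the already-quoted call object for remote evaluation
    results[[i]] <- RS.eval(conn, calls[[i]], lazy = FALSE)
  }
  results
}

# Example batch: a set of independent calls, e.g.
# dispatch(list(quote(sum(1:10)), quote(mean(rnorm(100)))))
```

What is missing from this sketch, and what I am really looking for in a package, is the queueing layer: accepting overlapping sets, buffering them FIFO, and balancing them across the four workers.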
This is not about code parallelism (snow, snowfall, Condor and other traditional cluster solutions).
Do you know a good package / tool designed for R that might help?
Thanks a lot!