I have an R application that talks to Java daemons via stdin and stdout in an infinite loop, and it appears to have a memory leak. Simplified R application:
while (TRUE) {
  con  <- file('stdin', open = 'r', blocking = TRUE)
  line <- scan(con, what = character(0), nlines = 1, quiet = TRUE)
  close(con)
}
This loop uses more and more RAM over time. Even if I call gc() manually after close(con), the drop in memory usage is only short-lived, and usage ultimately grows without bound.
A minimal pipeline to reproduce this:
Rscript --vanilla -e "while(TRUE)cat(runif(1),'\n')" | \
  Rscript --vanilla -e "cat(Sys.getpid(), '\n'); while (TRUE) { con <- file('stdin', open = 'r', blocking = TRUE); line <- scan(con, what = character(0), nlines = 1, quiet = TRUE); close(con); gc() }"
This starts two R processes: one writing to stdout, and the other reading from stdin through the pipe (the second one prints its pid so you can track its memory usage).
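To watch that memory growth, here is a minimal sketch of a monitoring loop using `ps`, assuming a POSIX system; the pid value is a placeholder you would replace with the one printed by the second Rscript at startup:

```shell
# Poll the resident set size (RSS, in KB) of the reader process once per second.
PID=12345   # hypothetical pid; substitute the one printed by the second Rscript
while kill -0 "$PID" 2>/dev/null; do
  ps -o rss= -p "$PID"
  sleep 1
done
```

If the leak is real, the printed RSS values should climb steadily rather than plateau after the first few iterations.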

I'm not sure what I'm doing wrong, but I'd like to stop this memory leak, so any help would be greatly appreciated.