Forcing R (and RStudio) to use virtual memory on Windows

I work with large data sets, and R quite often raises an error saying that it cannot allocate a vector of that size, or that there is not enough memory.

My computer has 16 GB of RAM (Windows 10) and my data sets are about 4 GB, but some operations need far more memory than that, for example converting a data set from wide format to long format. In some situations I can call gc() to reclaim some memory, but often it is not enough.
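For illustration, here is a minimal sketch of the kind of wide-to-long reshape I mean, using data.table; the table size and column names are invented:

library(data.table)

# Invented example: a wide table with 1e6 rows and 50 value columns.
dt_wide <- data.table(id = seq_len(1e6))
for (j in sprintf("v%02d", 1:50)) dt_wide[, (j) := rnorm(.N)]

# melt() turns the 50 value columns into rows, so the long table has
# 50 million rows and the peak memory use is far above the input size.
dt_long <- melt(dt_wide, id.vars = "id",
                variable.name = "variable", value.name = "value")

gc()  # reclaims some of the intermediate memory, but often not all of it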

Sometimes I can split a data set into small pieces, but sometimes I need to work with the whole table at once.

I read that Linux users do not have this problem, but what about Windows?

I tried setting up a large page file on an SSD (200 GB), but I found that R does not use it at all.

Watching Task Manager, I can see that as soon as R's memory consumption reaches the 16 GB of physical RAM, the allocation error appears. The size of the page file does not seem to matter.

How can I get R to use the page file? Do I need to compile it with some special flags?

P.S.: In my experience, deleting an object with rm() and then calling gc() does not free all of the memory. As I run successive operations on large data sets, my computer has less and less free memory at each step, whether or not I call gc().
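To make this concrete, this is roughly the pattern I mean (the object name is invented; memory.size() is the Windows-only helper, and it is a stub in R 4.2 and later):

big <- matrix(rnorm(5e7), ncol = 100)   # roughly 400 MB of doubles
memory.size()                           # MB currently used by the R process
rm(big)
invisible(gc())                         # ask R to return the freed pages
memory.size()                           # often does not fall back to the value
                                        # seen before 'big' was created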

P.S. 2: I hope not to get trivial answers such as "you need more RAM".

P.S. 3: I work in RStudio, but the same thing happens in plain R, so the problem is not specific to RStudio.

When R is started from RStudio, setting the R_MAX_MEM_SIZE environment variable may not take effect, so raise the limit from R itself, for example in your .Rprofile.

The call is memory.limit(64000) (the size is given in MB).

To make it permanent, add this line to your .Rprofile:

invisible(utils::memory.limit(64000))

Once the limit is above the amount of physical RAM, Windows serves the extra allocations from the page file. Keep in mind that paging to disk is orders of magnitude slower than RAM, so any operation that spills over will run very slowly.
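A minimal sketch of how to check this interactively, assuming a Windows build of R older than 4.2 (in newer versions memory.limit() and memory.size() are stubs); 64000 is just the figure used above:

memory.limit()              # current cap in MB; defaults to the installed RAM
memory.limit(size = 64000)  # raise the cap to about 64 GB; allocations beyond
                            # physical RAM are then served from the page file
memory.size(max = TRUE)     # peak memory (in MB) R has obtained from Windows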

Source: https://habr.com/ru/post/1656822/

