The memory reserved by R is twice the size of the allocated array

I noticed the following behavior. Let's say I create the following multidimensional array:

spam = array(runif(96*48*60*360), dim = c(96,48,60,360)) 

It is pretty predictable how much memory R should use for this, namely (96 * 48 * 60 * 360) * 8 bytes = 759.4 MB, since runif() returns double-precision values. This is confirmed by the lsos function (see this post):

    > lsos()
             Type      Size PrettySize Rows Columns
    spam    array 796262520   759.4 Mb   96      48
    lsos function       776  776 bytes   NA      NA
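As a cross-check (a minimal sketch, assuming the `spam` object from above is still in the workspace), the expected figure can be reproduced directly in R:

    # 96 * 48 * 60 * 360 = 99,532,800 elements, each a double of 8 bytes
    n <- 96 * 48 * 60 * 360
    n * 8 / 1024^2                            # ~759.4 Mb of raw data

    # object.size() reports the same figure plus a small header overhead
    print(object.size(spam), units = "Mb")    # "759.4 Mb"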

R as a process uses much more memory, about twice as much:

    $ top | grep rsession
    82:17628 hiemstra  20   0 1614m 1.5g 8996 S  0.3 40.4  0:04.85 rsession

Why does R do this? I assume it reserves the additional memory so that it is readily available for R to use? Any thoughts?

1 answer

Because the garbage collector has not run yet.
A lot of garbage was probably created while building that large array, and it has not been cleaned up yet.
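A plausible source of that garbage in this particular example (an assumed mechanism, not something stated in the answer): runif(96*48*60*360) first allocates one ~760 MB double vector, and array() then copies it into a new object with the dim attribute set, so the process briefly holds two copies until the anonymous temporary is collected:

    # Assumed mechanism, written out with an explicit temporary for clarity
    tmp  <- runif(96 * 48 * 60 * 360)             # ~760 Mb double vector
    spam <- array(tmp, dim = c(96, 48, 60, 360))  # array() copies the data: a second ~760 Mb block
    rm(tmp)                                       # in the original one-liner the temporary is anonymous
    gc()                                          # after collection only one copy remains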

If you force garbage collection by calling the gc() function, you will see that the memory used will be pretty close to the size of your array:

    > memory.size()
    [1] 775.96
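For completeness, a sketch of how one might inspect this (note that memory.size() is Windows-only, and is a stub that only returns Inf in recent R versions; on other platforms the Vcells row of gc()'s output gives a comparable figure in Mb):

    gc()            # force a full collection; the Vcells row shows vector memory used, in Mb
    memory.size()   # Windows only: memory currently used by the R process, in Mb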

Source: https://habr.com/ru/post/921372/

