I expect to generate a lot of data and then process it in R. How can I estimate the size of a data.frame (and therefore the required memory) from the number of rows, the number of columns and the types of the variables?
Example.
If I have 10,000 rows and 150 columns, of which 120 are numeric, 20 are character and 10 are factor, what size of data frame can I expect? Will the result change depending on the data stored in the columns (e.g. max(nchar(column)))?
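
As a rough rule of thumb (a sketch, not an exact formula): a numeric cell takes 8 bytes, an integer or factor cell 4 bytes, and a character cell stores a pointer (4 bytes on 32-bit R, 8 bytes on 64-bit) into R's shared string cache, plus a small per-column and attribute overhead. A hypothetical back-of-the-envelope helper along those lines (estimate_df_size is not a built-in function):

estimate_df_size <- function(n_rows, n_numeric = 0, n_integer = 0,
                             n_factor = 0, n_character = 0, ptr_bytes = 8) {
  # bytes per cell: 8 for doubles, 4 for integer codes (integers and factors),
  # ptr_bytes for character cells (pointers into the shared string cache)
  n_rows * (8 * n_numeric + 4 * (n_integer + n_factor) + ptr_bytes * n_character)
}

# the example above: 10,000 rows with 120 numeric, 20 character, 10 factor columns
estimate_df_size(1e4, n_numeric = 120, n_character = 20, n_factor = 10)
# 11600000 bytes, i.e. roughly 11-12 MB, before counting the unique strings themselves

Here is what object.size() reports for a few quick experiments: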
> m <- matrix(1,nrow=1e5,ncol=150)
> m <- as.data.frame(m)
> object.size(m)
120009920 bytes
> a=object.size(m)/(nrow(m)*ncol(m))
> a
8.00066133333333 bytes
> m[,1:150] <- sapply(m[,1:150],as.character)
> b=object.size(m)/(nrow(m)*ncol(m))
> b
4.00098133333333 bytes
> m[,1:150] <- sapply(m[,1:150],as.factor)
> c=object.size(m)/(nrow(m)*ncol(m))
> c
4.00098133333333 bytes
> m <- matrix("ajayajay",nrow=1e5,ncol=150)
> m <- as.data.frame(m)
> object.size(m)
60047120 bytes
> d=object.size(m)/(nrow(m)*ncol(m))
> d
4.00314133333333 bytes