Using the ff package in R, I imported the csv file into an ffdf object, but was surprised to find that the object took up about 700 MB of RAM. Isn't ff supposed to keep the data on disk rather than in RAM? Did I do something wrong? I am new to R. Any advice is welcome. Thanks.
> training.ffdf <- read.csv.ffdf(file="c:/temp/training.csv", header=T)
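For what it's worth, this is how I understand one could check whether the data really lives on disk (a sketch based on my reading of the ff documentation; object.size() is from base R, while filename() and the fftempdir option come from ff):

library(ff)

# Import the csv into an ffdf; the columns should be backed by files on disk
training.ffdf <- read.csv.ffdf(file = "c:/temp/training.csv", header = TRUE)

# Size of the R-side wrapper object; should be small if the data is on disk
print(object.size(training.ffdf), units = "MB")

# Directory where ff keeps its backing files
getOption("fftempdir")

# Backing file of the first column
filename(training.ffdf[[1]])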
Edit: I followed Tommy's advice, skipped the call to object.size, and looked at the Task Manager instead (I ran R on a Windows XP machine with 4 GB of RAM). I removed the object, closed R, reopened it, and loaded the data from the file. The problem persisted:
> library(ff); library(biglm)
> # At this point RGui.exe had used up 26176 KB of memory
> ffload(file="c:/temp/trainingffimg")
> # Now 701160 KB
> fit <- biglm(y ~ as.factor(x), data=training.ffdf)
Error: cannot allocate vector of size 18.5 Mb
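The workaround I have been considering (untested, and assuming I read the ff and biglm help pages correctly: chunk() is from ff, update() from biglm) is to fit the model chunk by chunk so that only one block of rows is in RAM at a time:

library(ff); library(biglm)

ffload(file = "c:/temp/trainingffimg")

# Split the row range into chunks that fit in RAM
chunks <- chunk(training.ffdf)

# Fit on the first chunk, then update the fit with the remaining chunks.
# Note: as.factor(x) needs consistent factor levels across chunks for this to work.
fit <- biglm(y ~ as.factor(x), data = training.ffdf[chunks[[1]], ])
for (i in chunks[-1]) {
  fit <- update(fit, training.ffdf[i, ])
}
summary(fit)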
I also tried
> options("ffmaxbytes" = 402653184)
but after loading the data, RGui still used more than 700 MB of memory, and the biglm regression still failed with the same error.
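For completeness, these are the ff settings I was planning to try next (the specific values are guesses on my part; ffmaxbytes, ffbatchbytes, fftempdir and the next.rows argument are documented in ff, but I may be misusing them):

library(ff)

# Cap the RAM that ff may use and the amount processed per batch (bytes)
options(ffmaxbytes = 256 * 1024^2)
options(ffbatchbytes = 16 * 1024^2)

# Put the backing files on a disk with enough free space
options(fftempdir = "c:/temp/ffdir")

# Read the csv in chunks of 100000 rows
training.ffdf <- read.csv.ffdf(file = "c:/temp/training.csv",
                               header = TRUE, next.rows = 100000)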