I have the following code to load some data in my .Rprofile (an R script in my project folder that runs automatically when I switch to the project in RStudio).
    data_files <- list.files(pattern = "\\.(RData|rda)$")
    if ("data.rda" %in% data_files) {
      attach(what = "data.rda", pos = 2)
      cat("The file 'data.rda' was attached to the search path under 'file:data.rda'.\n\n")
    }
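For comparison, loading the file into a separate environment instead of attaching it would look roughly like this (a sketch I have not actually switched to; `data_env` is just an illustrative name):

    # Sketch: load 'data.rda' into a dedicated environment instead of
    # attaching it to the search path. 'data_env' is a hypothetical name.
    if (file.exists("data.rda")) {
      data_env <- new.env()
      load("data.rda", envir = data_env)
      cat("Loaded", length(ls(data_env)), "objects from 'data.rda'.\n")
    }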
The loaded objects are relatively large:
                            Type             Size PrettySize    Rows Columns
    individual_viewings_26  data.frame 1547911120     1.4 Gb 3685312      63
    viewing_statements_all  data.table  892316088     851 Mb 3431935      38
    weights                 data.frame  373135464   355.8 Mb 3331538      14
    pet                     data.table   63926168      61 Mb  227384      34
But I have 16 GB of RAM, and R can normally allocate all of it:
    > memory.limit()
    [1] 16289
When my data was not so big, I had no problems. I recently saved some more data into data.rda, and my R session now fails at startup (when I switch to a project in RStudio and .Rprofile is executed):
    Error: cannot allocate vector of size 26.2 Mb
    In addition: Warning messages:
    1: Reached total allocation of 2047Mb: see help(memory.size)
    2: Reached total allocation of 2047Mb: see help(memory.size)
    3: Reached total allocation of 2047Mb: see help(memory.size)
    4: Reached total allocation of 2047Mb: see help(memory.size)
I suspect that for some reason the memory limit is set to 2 GB at startup. How can I change this?
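The only workaround I can think of is to raise the limit explicitly at the very top of .Rprofile, before the data is attached. An untested sketch (memory.limit(size =) is Windows-only and takes a value in Mb):

    ## Untested sketch for the very top of .Rprofile:
    ## try to raise the memory limit before the data is attached.
    ## memory.limit(size =) only works on Windows; the value is in Mb.
    if (.Platform$OS.type == "windows") {
      memory.limit(size = 16000)
    }

But I would rather understand why the limit is 2 GB at startup in the first place.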
Edit: added OS and software version
    > sessionInfo()
    R version 3.2.2 (2015-08-14)
    Platform: x86_64-w64-mingw32/x64 (64-bit)
    Running under: Windows 7 x64 (build 7601) Service Pack 1
Edit 2: Just to clarify: I can load the data myself by running the same code after startup, I have plenty of available memory, and the R process routinely uses up to 10 GB during my daily work. The problem is that there seems to be a 2 GB memory limit while R loads and executes .Rprofile ...
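If it helps with diagnosis, I could add a line like this at the top of .Rprofile to log the limit R sees while the file is being executed (sketch, not yet run):

    ## Sketch of a startup diagnostic for .Rprofile (not yet run):
    ## print the memory limit R reports while this file is executing.
    cat("memory.limit() during .Rprofile:", memory.limit(), "Mb\n")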