[R] Why does a 2 GB .RData file exceed my 16 GB memory limit when reading it in?

David Jones david.tn.jones at gmail.com
Wed Sep 2 04:44:51 CEST 2020


I ran a number of analyses in R and saved the workspace, which
resulted in a 2 GB .RData file. When I later try to load the file
back into R, it fails with the error: "Error: cannot allocate vector
of size 37 Kb"
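
For reference, the workflow was essentially the following (the file
name here is illustrative, not the one I actually used):

## ... run analyses, building up objects in the workspace ...
save.image("analyses.RData")    # writes the ~2 GB .RData file

## later, in a fresh R session:
load("analyses.RData")
## Error: cannot allocate vector of size 37 Kb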

The error appears after about a minute of loading, so I presume a
single vector allocation pushes it over the memory limit. But
memory.limit() shows that I have access to the full 16 GB of RAM on
my machine (12 GB of which are free when I try to load the .RData file).
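
Concretely, I checked with the (Windows-only) memory helpers, along
these lines:

memory.limit()           # maximum Mb R may use; 16384 here, i.e. 16 GB
memory.size()            # Mb currently in use by this R session
memory.size(max = TRUE)  # maximum Mb obtained from the OS so far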

gc() shows the following after I receive this error:

          used (Mb) gc trigger   (Mb)   max used    (Mb)
Ncells  623130 33.3    4134347  220.8    5715387   305.3
Vcells 1535682 11.8  883084810 6737.5 2100594002 16026.3
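
If I read the Vcells row correctly, the "max used" count times 8
bytes per Vcell works out to the whole 16 GB, which I presume is why
even a 37 Kb allocation then fails:

2100594002 * 8 / 1024^3   # ~15.65 GB, matching 16026.3 Mb / 1024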


