[R] Why does a 2 GB RData file exceed my 16GB memory limit when reading it in?

John jwd at surewest.net
Wed Sep 2 22:27:29 CEST 2020


On Wed, 2 Sep 2020 13:36:43 +0200
Uwe Ligges <ligges at statistik.tu-dortmund.de> wrote:

> On 02.09.2020 04:44, David Jones wrote:
> > I ran a number of analyses in R and saved the workspace, which
> > resulted in a 2GB .RData file. When I try to read the file back
> > into R  
> 
> Compressed in RData but uncompressed in main memory....
> 
> 
> > later, it won't read into R and provides the error: "Error: cannot
> > allocate vector of size 37 Kb"
> > 
> > This error comes after 1 minute of trying to read things in - I
> > presume a single vector sends it over the memory limit. But,
> > memory.limit() shows that I have access to a full 16gb of ram on my
> > machine (12 GB are free when I try to load the RData file).  
> 
> But the data may need more....
> 
> 
> > gc() shows the following after I receive this error:
> > 
> >              used (Mb) gc trigger   (Mb)   max used    (Mb)
> > Ncells     623130 33.3    4134347  220.8    5715387   305.3
> > Vcells    1535682 11.8  883084810 6737.5 2100594002 16026.3
> 
> So 16GB were used when R gave up.
> 
> Best,
> Uwe Ligges
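
(A quick sanity check on those gc() numbers, assuming the usual 8 bytes
per Vcell:

  2100594002 * 8 / 1024^2   # ~16026 Mb, i.e. the "max used (Mb)" figure

so R had indeed touched essentially the whole 16 GB before the failed
37 Kb allocation.)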

For my own part, looking at the OP's question, it does seem curious
that R could write that .RData file but then, on the same system, be
unable to reload something it created.  How would that work?  Wouldn't
the memory limit have been exceeded BEFORE the .RData file was written
the FIRST time?
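
That said, Uwe's point about compression is easy to demonstrate: the
on-disk size of an .RData file says very little about the memory the
data needs once loaded.  A minimal sketch, using a made-up, highly
compressible vector purely for illustration:

  x <- rep(0, 1e7)                     # 10 million doubles, ~76 Mb in RAM
  print(object.size(x), units = "Mb")

  f <- tempfile(fileext = ".RData")
  save(x, file = f)                    # gzip-compressed by default
  file.size(f)                         # orders of magnitude smaller on disk

A real workspace won't compress anywhere near that well, of course, but
it shows why a 2 GB .RData file can need far more than 2 GB of RAM, and
possibly more than 16 GB, to load again.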

JDougherty


