[Rd] memory footprint of readRDS()
Brian G. Peterson
bri@n @ending from br@verock@com
Tue Sep 18 17:30:55 CEST 2018
Your RDS file is likely compressed, and may achieve a compression ratio
of 10x or more depending on the composition of the data in it and the
compression method used. 'gzip' compression is used by default.
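As a minimal sketch of this (the data frame and file names here are
made up for illustration), you can compare the on-disk size of the same
object saved with the default gzip compression versus uncompressed:

```r
# Hypothetical example object: one hard-to-compress numeric column,
# one highly compressible character column.
x <- data.frame(a = rnorm(1e5),
                b = sample(letters, 1e5, replace = TRUE),
                stringsAsFactors = FALSE)

f_gz  <- tempfile(fileext = ".rds")
f_raw <- tempfile(fileext = ".rds")

saveRDS(x, f_gz)                     # default: compress = TRUE (gzip)
saveRDS(x, f_raw, compress = FALSE)  # uncompressed serialization

file.size(f_gz)    # on-disk size with gzip
file.size(f_raw)   # on-disk size without compression
object.size(x)     # in-memory footprint, which can exceed both
```

The gap between file.size() of the compressed file and object.size()
of the loaded object gives a rough sense of why a 3.8 GB RDS file can
expand to far more than 3.8 GB in memory.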
--
Brian G. Peterson
http://braverock.com/brian/
Ph: 773-459-4973
IM: bgpbraverock
On Tue, 2018-09-18 at 17:28 +0200, Joris Meys wrote:
> Dear all,
>
> I tried to read in a 3.8Gb RDS file on a computer with 16Gb available
> memory. To my astonishment, the memory footprint of R rises quickly
> to over 13Gb and the attempt ends with an error that says "cannot
> allocate vector of size 5.8Gb".
>
> I would expect that 3 times the memory would be enough to read in
> that file, but apparently I was wrong. I checked the memory.limit()
> and that one gave me a value of more than 13Gb. So I wondered if this
> was to be expected, or if there could be an underlying reason why
> this file doesn't want to open.
>
> Thank you in advance
> Joris
>