[R] R 3.5.0, vector memory exhausted error on readBin

luke-tierney at uiowa.edu
Tue Jun 12 11:26:37 CEST 2018


This item in NEWS explains the change:

     • The environment variable R_MAX_VSIZE can now be used to specify
       the maximal vector heap size. On macOS, unless specified by this
       environment variable, the maximal vector heap size is set to the
       maximum of 16GB and the available physical memory. This is to
       avoid having the R process killed when macOS over-commits memory.

You can set R_MAX_VSIZE to a larger value, but you should do some
experimenting to decide on a safe value for your system. macOS is
quite good at using virtual memory up to a point, but beyond that it
degrades badly. On my 4 GB Mac, numeric(8e9) works (8e9 doubles is
about 64 GB of address space) but numeric(9e9) gets R killed, so a
setting of around 60 GB _might_ be safe.
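
For example (a sketch, not a recommendation for your machine - the
100Gb figure is only an illustration), one way to raise the limit is
to set it in ~/.Renviron, which R reads at startup; setting the
variable after R has started has no effect:

     # in ~/.Renviron; a size with a unit suffix such as Gb should work
     R_MAX_VSIZE=100Gb

You can confirm what the variable is set to from within R with
Sys.getenv("R_MAX_VSIZE").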

The file size probably doesn't matter in your example, since you are
requesting a large value for n - I can't tell how large, since you
didn't provide your value of 'hertz'.
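
To make that concrete (a sketch, not your actual code - 'hertz' is
assumed to be 100 below since you didn't give it, and files[i] is the
placeholder path from your post): readBin allocates its full
n-element result vector up front, which is why a huge n exhausts the
vector heap even for a 12 MB file. Capping n by the file size avoids
asking for more than the file can supply:

     hertz <- 100                       # assumed value, illustration only
     n.req <- 8 * hertz * 60 * 60000    # 2.88e9 elements requested
     4 * n.req / 2^30                   # ~10.7 GiB up front: size = 2
                                        # values are stored as 4-byte
                                        # R integers

     path  <- files[i]                  # placeholder from the post
     n.max <- file.size(path) %/% 2     # at most this many 2-byte values
     con   <- file(path, "rb")
     datavals <- readBin(con, integer(), size = 2, n = n.max,
                         endian = "little")
     close(con)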

Best,

luke

On Mon, 11 Jun 2018, Valerie Cavett wrote:

> I've been reading in binary data collected via LabVIEW for a project, and after upgrading to R 3.5.0, the code returns an error indicating that the 'vector memory is exhausted'. I'm happy to provide a sample binary file; even ones that are quite small (12 MB) generate this error. (I wasn't sure whether a binary file attached to this email would trigger a spam filter.)
>
> bin.read = file(files[i], "rb")
> datavals = readBin(bin.read, integer(), size = 2, n = 8*hertz*60*60000, endian = "little")
>
> Error: vector memory exhausted (limit reached?)
>
>
> sessionInfo()
> R version 3.5.0 (2018-04-23)
> Platform: x86_64-apple-darwin15.6.0 (64-bit)
> Running under: macOS Sierra 10.12.6
>
>
> This does not happen in R 3.4 (R version 3.4.4 (2018-03-15) -- "Someone to Lean On"): the vector is created and populated from the binary file without issue, even at a 1 GB file size.
>
> Other files read in as CSV, even at 1 GB, load correctly in 3.5, so I assume this is a function of how vectors are defined or allocated having changed in some way from 3.4 to 3.5.
>
> Any help, suggestions or workarounds are greatly appreciated!
> Val
>

-- 
Luke Tierney
Ralph E. Wareham Professor of Mathematical Sciences
University of Iowa                  Phone:             319-335-3386
Department of Statistics and        Fax:               319-335-3017
    Actuarial Science
241 Schaeffer Hall                  email:   luke-tierney at uiowa.edu
Iowa City, IA 52242                 WWW:  http://www.stat.uiowa.edu

