[BioC] segfault ReadAffy cause 'memory not mapped'
Brian D. Peyser PhD
brian.peyser at nih.gov
Thu Aug 22 00:56:58 CEST 2013
On Thu Aug 22 2013 6:18 PM, I wrote:
> I had considered it could be a limit of the signed int indices for R
> vectors/arrays, but I thought that had changed as of R v3.0. Also, I
> thought that would give the error 'too many elements specified' rather
> than a 'memory not mapped' segfault. I've certainly allocated close to
> 64 GiB to R doing other things with these data, I'm just not sure if any
> individual vectors were that large.
I just ran:
$ R
R version 3.0.1 (2013-05-16) -- "Good Sport"
Copyright (C) 2013 The R Foundation for Statistical Computing
Platform: x86_64-pc-linux-gnu (64-bit)
> temp = array(rnorm(3750*604258), c(3750, 604258))
>
and R was allocated 30.0 GiB without crashing. According to the
hgu133plus2probe package, there are 604258 probes (_not_ probesets) on
each hgu133plus2 GeneChip, and I have 3750 chips. So I can generate a
3750-by-604258 array of random data (about 2.27e9 elements, beyond the
old 2^31 - 1 vector-length limit) without a segfault, and R shoots
right past 16 GiB allocated with no hiccups.
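For reference, here is a quick sanity check (not part of the original test) showing that this array really does cross the old 32-bit index limit, so long-vector support in R >= 3.0 must be doing its job:

```r
# Element count of the test array vs. the old 32-bit index limit
n_chips  <- 3750
n_probes <- 604258
n_elems  <- n_chips * n_probes     # 2,265,967,500 elements

old_limit <- .Machine$integer.max  # 2^31 - 1 = 2,147,483,647
n_elems > old_limit                # TRUE: exceeds the pre-3.0 limit

# Approximate memory for a double-precision matrix of that size:
n_elems * 8 / 2^30                 # ~16.9 GiB, consistent with what I saw
```

So the allocation alone rules out the vector-length limit as the cause of the segfault.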
-Brian
--
Brian D. Peyser PhD
Special Assistant to the Associate Director
Office of the Associate Director
Developmental Therapeutics Program
Division of Cancer Treatment and Diagnosis
National Cancer Institute
National Institutes of Health
301-524-5587 (mobile)