[R] Error: cannot allocate vector of size 3.4 Gb
Peng Yu
pengyu.ut at gmail.com
Sat Nov 7 13:12:28 CET 2009
On Fri, Nov 6, 2009 at 8:19 PM, Benilton Carvalho <bcarvalh at jhsph.edu> wrote:
> this is converging to the Bioconductor (BioC) list.
>
> let me know what your sessionInfo() is and what type of CEL files you're
> trying to read; additionally, describe exactly how you reproduce the problem.
Here is my sessionInfo(); the pname printed for each file is 'moex10stv1cdf'.
> for (f in list.celfiles('.', full.names = TRUE, recursive = TRUE)) {
+   print(f)                           # the CEL file path
+   pname <- cleancdfname(whatcdf(f))  # CDF package name for this file
+   print(pname)
+ }
> sessionInfo()
R version 2.9.2 (2009-08-24)
x86_64-unknown-linux-gnu
locale:
LC_CTYPE=en_US.UTF-8;LC_NUMERIC=C;LC_TIME=en_US.UTF-8;LC_COLLATE=en_US.UTF-8;LC_MONETARY=C;LC_MESSAGES=en_US.UTF-8;LC_PAPER=en_US.UTF-8;LC_NAME=C;LC_ADDRESS=C;LC_TELEPHONE=C;LC_MEASUREMENT=en_US.UTF-8;LC_IDENTIFICATION=C
attached base packages:
[1] stats graphics grDevices utils datasets methods base
other attached packages:
[1] pd.moex.1.0.st.v1_2.4.1 RSQLite_0.7-2 DBI_0.2-4
[4] oligo_1.8.3 preprocessCore_1.6.0 oligoClasses_1.6.0
[7] Biobase_2.4.1
loaded via a namespace (and not attached):
[1] affxparser_1.16.0 affyio_1.12.0 Biostrings_2.12.9 IRanges_1.2.3
[5] splines_2.9.2
> it appears to me, though I'm not sure, that you start a fresh session of R
> and then try to read in the data. How much memory is actually available when
> you try reading it in? Having 8GB of RAM does not mean that 8GB is free at
> the moment you attempt the task.
>
> b
>
> On Nov 7, 2009, at 12:08 AM, Peng Yu wrote:
>
>> On Fri, Nov 6, 2009 at 5:00 PM, Marc Schwartz <marc_schwartz at me.com>
>> wrote:
>>>
>>> On Nov 6, 2009, at 4:19 PM, Peng Yu wrote:
>>>
>>>> On Fri, Nov 6, 2009 at 3:39 PM, Charlie Sharpsteen
>>>> <chuck at sharpsteen.net>
>>>> wrote:
>>>>>
>>>>> On Fri, Nov 6, 2009 at 1:30 PM, Peng Yu <pengyu.ut at gmail.com> wrote:
>>>>>>
>>>>>> I run R on a Linux machine that has 8GB of memory, but R gives me the
>>>>>> error "Error: cannot allocate vector of size 3.4 Gb". I'm wondering
>>>>>> why it cannot allocate 3.4 Gb on an 8GB machine. How can I fix this?
>>>>>
>>>>> Is it 32-bit R or 64-bit R?
>>>>>
>>>>> Are you running any other programs besides R?
>>>>>
>>>>> How far into your data processing does the error occur?
>>>>>
>>>>> The more statements you execute, the more "fragmented" R's available
>>>>> memory pool becomes; a 3.4 Gb vector needs a single contiguous block
>>>>> of that size, which may no longer be available.
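>>>>> A rough, untested way to watch for this is to reset R's memory
>>>>> counters before processing and inspect the high-water marks after:
>>>>>
>>>>>   gc(reset = TRUE)   # reset the "max used" counters
>>>>>   # ... run the processing steps ...
>>>>>   gc()               # the "max used" columns show the peak so far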
>>>>
>>>> I'm pretty sure it is 64-bit R, but I need to double-check. What
>>>> command should I use to check?
>>>>
>>>> It seems that it didn't do anything but read a lot of files before
>>>> the above error showed up.
>>>
>>>
>>> Check the output of:
>>>
>>> .Machine$sizeof.pointer
>>>
>>> If it is 4, R was built as 32-bit; if it is 8, R was built as 64-bit.
>>> See ?.Machine for more information.
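>>> For example (untested):
>>>
>>>   if (.Machine$sizeof.pointer == 8) "64-bit build" else "32-bit build"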
>>
>> It is 8. The code that gives the error is listed below. There are 70
>> CEL files. I'm wondering how to investigate what causes the problem and
>> how to fix it.
>>
>> library(oligo)
>> cel_files <- list.celfiles('.', full.names = TRUE, recursive = TRUE)
>> data <- read.celfiles(cel_files)
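>> One thing I could try (an untested sketch; the batch step of 10 is
>> arbitrary) is reading the files in growing batches, printing gc() after
>> each, to see roughly how many files fit in memory:
>>
>> library(oligo)
>> cel_files <- list.celfiles('.', full.names = TRUE, recursive = TRUE)
>> for (n in seq(10, length(cel_files), by = 10)) {
>>   cat("reading first", n, "files\n")
>>   batch <- read.celfiles(cel_files[seq_len(n)])
>>   print(gc())   # memory use after reading n files
>> }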
>>
>>> You can also check:
>>>
>>> R.version$arch
>>>
>>> and
>>>
>>> .Platform$r_arch
>>>
>>> which for a 64-bit build should show x86_64.
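>>> Putting the three checks together (untested):
>>>
>>>   c(pointer_bytes = .Machine$sizeof.pointer,
>>>     arch = R.version$arch,
>>>     r_arch = .Platform$r_arch)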
>>>
>>> HTH,
>>>
>>> Marc Schwartz