[R] Error: cannot allocate vector of size 3.4 Gb
Peng Yu
pengyu.ut at gmail.com
Sat Nov 7 03:08:44 CET 2009
On Fri, Nov 6, 2009 at 5:00 PM, Marc Schwartz <marc_schwartz at me.com> wrote:
> On Nov 6, 2009, at 4:19 PM, Peng Yu wrote:
>
>> On Fri, Nov 6, 2009 at 3:39 PM, Charlie Sharpsteen <chuck at sharpsteen.net>
>> wrote:
>>>
>>> On Fri, Nov 6, 2009 at 1:30 PM, Peng Yu <pengyu.ut at gmail.com> wrote:
>>>>
>>> I run R on a Linux machine that has 8GB of memory. But R gives me the
>>>> error "Error: cannot allocate vector of size 3.4 Gb". I'm wondering
>>>> why it cannot allocate 3.4 Gb on an 8GB machine. How can I fix the
>>>> problem?
>>>
>>> Is it 32-bit R or 64-bit R?
>>>
>>> Are you running any other programs besides R?
>>>
>>> How far into your data processing does the error occur?
>>>
>>> The more statements you execute, the more "fragmented" R's available
>>> memory pool becomes. A 3.4 Gb chunk may no longer be available.
>>
> I'm pretty sure it is 64-bit R, but I need to double check. What
>> command should I use to check?
>>
> It seems that it didn't do anything except read a lot of files
>> before the above error appeared.
>
>
> Check the output of:
>
> .Machine$sizeof.pointer
>
 If it is 4, R was built as 32-bit; if it is 8, R was built as 64-bit. See
> ?.Machine for more information.
It is 8. The code that gives the error is listed below. There are 70
CEL files. I'm wondering how to investigate what causes the problem and
how to fix it.
library(oligo)
cel_files <- list.celfiles(".", full.names = TRUE, recursive = TRUE)
data <- read.celfiles(cel_files)
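One way to narrow this down is to read the files in progressively larger batches and watch memory consumption after each one. This is only a diagnostic sketch, assuming the oligo calls above (list.celfiles / read.celfiles) work as shown; the batch sizes are arbitrary:

```r
library(oligo)

cel_files <- list.celfiles(".", full.names = TRUE, recursive = TRUE)

## Read growing subsets of the CEL files and report memory after each;
## the batch size at which the allocation error appears gives a rough
## bound on how much memory the full import needs.
for (n in c(10, 30, 50, length(cel_files))) {
  cat("reading", n, "CEL files\n")
  d <- read.celfiles(cel_files[seq_len(n)])
  print(gc())      # current and peak memory use, in Mb
  rm(d)
  gc()             # free the object before the next, larger batch
}
```

If even a small batch consumes a large fraction of the 8GB, the data set may simply not fit in memory alongside the OS and other processes; it is also worth checking for a per-process limit (e.g. `ulimit -v` in the shell) that could cap R below the physical RAM.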
> You can also check:
>
> R.version$arch
>
> and
>
> .Platform$r_arch
>
> which for 64 bit should show x86_64.
>
> HTH,
>
> Marc Schwartz
>
>