[R-SIG-Mac] Memory error
Alan Olav Bergland
Alan_Bergland at brown.edu
Mon Nov 28 14:27:27 CET 2005
The data set is fairly big.
Again, the model I was trying to run was:
lme(ovn ~ tlc + geno + log(food), clinal,
    random = ~ tlc + geno + log(food) | block/lat)
where,
"ovn" is a continuous, normal (length=999)
"tlc" is continuous, normal, definitely random
"food" could either be a factor with 6 levels or a numerical variable
(making it log(food) makes a more even distribution). Even though it
is an experimental treatment, I would like to treat it as random
because I am interested in the slope and its interaction with the
"geno" variable (although, note, there are no interactions in the
model above, as putting them in at this point would cause even more
problems)
"geno" is genotype, 12 total, 4 from each "lat." "geno" should
really be treated as random too, because they are random draws from
all genotypes in a population
"lat" is latitude, 3 total. Right now I'm treating it as fixed.
"block" is equivalent to replicate, 2 total.
The experimental design was the following: 12 genotypes, from 3
latitudes, reared under 6 food conditions, replicated twice, with
~15-20 individuals measured from each geno x food x block combination.
Because the individuals from each combination were reared in the same
rearing chamber, they are not independent, so some grouping structure
is necessary.
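(For concreteness, a sketch of the minimal grouping structure implied
above; the "chamber" variable is hypothetical, just the geno x food x
block combination named explicitly:)

  library(nlme)
  ## one random intercept per rearing chamber, the unit of
  ## non-independence described above
  clinal$chamber <- with(clinal, interaction(geno, food, block))
  m1 <- lme(ovn ~ tlc + geno + log(food), data = clinal,
            random = ~ 1 | chamber)

This estimates a single variance instead of a full covariance matrix
of random slopes per grouping level, so it is far cheaper than the
model that crashed.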
I appreciate your curiosity about this problem. I know that this forum
is not the place to hash out statistical questions, but this memory
issue seemed specific to Macs.
Cheers,
Alan
On Nov 28, 2005, at 8:01 AM, stefano iacus wrote:
> I don't think this is necessarily a user problem of allocating big
> chunks of memory; in the last few days I experienced the same kind
> of issue while iterating a linear optimization problem of very small
> dimension (something like a 5 x 5 matrix).
> In my particular case it was a sequence of independent calls to
> lpSolve on very small problems. Watching the "top" command in the
> shell, I saw the virtual memory figure (VSIZE?) grow almost linearly
> with the iterations, up to 3.5 GB.
> This happens even after forcing gc(). If I stop the iteration before
> the allocation failure, the previously allocated RAM is never given
> back until I quit R itself.
> I'm not able to debug this myself, but I can try to provide a
> reproducible example (not these days, but later on).
> I've faced the same problem on a G4 and a dual G5.
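>
> (Something along these lines; the objective and constraints below
> are made up for illustration, only the shape of the iteration
> matches what I was doing:)
>
>   library(lpSolve)
>   for (i in 1:100000) {
>     ## a tiny LP, solved over and over; each call is independent
>     obj <- runif(5)
>     con <- matrix(runif(25), nrow = 5)
>     sol <- lp("min", obj, con, rep("<=", 5), rep(1, 5))
>     if (i %% 1000 == 0) gc()  ## forcing gc() does not give the memory back
>   }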
>
> I'm curious to see the dimensions of the data Alan is using, but I'm
> confident that this is not where the problem lies.
>
> stefano
>
>
> On Nov 28, 2005, at 1:31 AM, Simon Urbanek wrote:
>
>>
>> On Nov 27, 2005, at 4:37 PM, Alan Olav Bergland wrote:
>>
>>> When I attempt to run a rather hefty lme model, I get the following
>>> error message:
>>>
>>>> clinal7.lme<-lme(ovn~tlc+geno+log(food), clinal, random=~tlc+geno
>>> +log(food)|block/lat)
>>> Error in logLik.lmeStructInt(lmeSt, lmePars) :
>>> Calloc could not allocate (500237956 of 8) memory
>>> R(1049,0xa000ed68) malloc: *** vm_allocate(size=4001906688) failed
>>> (error code=3)
>>> R(1049,0xa000ed68) malloc: *** error: can't allocate region
>>> R(1049,0xa000ed68) malloc: *** set a breakpoint in szone_error to
>>> debug
>>>
>>>
>>> I'm running R Version 2.2.0 (2005-10-06 r35749) on an iMac running
>>> OS X 10.4.3.
>>>
>>>
>>> Any suggestions?
>>
>> Reformulate your problem.
>> The lme call you ran with that data attempts to allocate 4GB of
>> memory (500237956 doubles x 8 bytes, which is the ~4GB vm_allocate
>> request in the error once rounded up to a page boundary), and that
>> is a bit too much. Even if you put 8GB in a G5 and run a 64-bit
>> version of R, it's likely to run out of memory or to take forever ...
>> Maybe someone will be able to help you solve your problem in a
>> different way if you specify more precisely what you are trying to
>> do (including the size of the data etc.).
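>>
>> (A sketch of what I mean, assuming the nlme machinery you are
>> already using; pdDiag restricts each random-effects covariance
>> matrix to a diagonal, so the parameter count grows linearly in the
>> number of effects rather than quadratically:)
>>
>>   library(nlme)
>>   ## illustrative only: diagonal random-effects covariance at each
>>   ## grouping level, and geno dropped from the random part
>>   fit <- lme(ovn ~ tlc + geno + log(food), data = clinal,
>>              random = list(block = pdDiag(~ tlc + log(food)),
>>                            lat   = pdDiag(~ tlc + log(food))))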
>>
>> Cheers,
>> Simon