[R] Response to questions raised in Mar 17 reply
Jeff Yanosky
jyanosky at hsph.harvard.edu
Thu Jun 10 16:04:27 CEST 2004
Hi,
I would like to respond to a few questions raised in a reply to a question
posted by another user on March 17, 2004. The entire message is copied at the
end of this email.
The relevant questions and statements are as follows:
> What did you not understand about help(memory.size)?
> This is also in the rw-FAQ: what in that did you not understand?
> ...
> Yes, so try a machine with 2Gb RAM.
I have received the "Error: cannot allocate vector of size xxx Kb" message many
times and have scoured the R help files (?memory.limit), the R-help archive
pages, and the R for Windows FAQ (rw-FAQ). The following are the specific
points I do not understand in the help information available on memory issues:
1) In ?memory.limit, it is stated that memory.size(max=TRUE) will return "the
maximum amount of memory obtained from the operating system". However, on my
system this returns only 1.57 GB, even though under System Properties
Windows XP reports 2.5 GB of RAM.
2) In ?memory.limit, it is stated that memory.limit(size=NA) will "report the
memory size", but is this the available memory, the total memory, or the
amount in use? If it is the total, how is it different from
memory.size(max=TRUE)? On my system the results of the three relevant memory
functions are as follows:
> MemSizeinGB <- memory.limit(size = NA) / 1e9
> MaxMemGB <- memory.size(max = TRUE) / 1e9
> MemInUseGB <- memory.size(max = FALSE) / 1e9
> MemSizeinGB
[1] 4.246733
> MaxMemGB
[1] 1.567990
> MemInUseGB
[1] 0.7674916
3) I have used the --max-mem-size option in the icon properties (Shortcut tab,
Target field, after the pathname of the executable), and this appears to have
increased the result of memory.limit(size=NA). However, even with the memory
limit set to 4.24 GB, I receive "Error: cannot allocate vector of size
135154 Kb" when trying to fit a GAM to some temperature data. How was the "2
GB" quantity determined in the reply below? Is there a rule of thumb for
determining how much memory will be needed for a vector of a given size? My
own back-of-envelope attempt is sketched below.
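For reference, here is the rough calculation I have been using. This is only a
sketch: it assumes the vector holds double-precision numerics at 8 bytes per
element, and the shortcut path and size value shown are illustrative, not my
actual installation:

## Shortcut Target field, to raise R's allocation limit at startup
## (path and value illustrative):
## "C:\Program Files\R\rw1090\bin\Rgui.exe" --max-mem-size=2500M

## Back-of-envelope size of the failing allocation, assuming 8 bytes
## per double; R also needs the vector as one contiguous block.
kb <- 135154
bytes <- kb * 1024
bytes / 8      # about 17.3 million doubles
bytes / 2^20   # about 132 Mb in a single contiguous chunk

If that reasoning is right, the single vector is only about 132 Mb, which is
why I am puzzled that a 4.24 GB limit is not enough.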
Thanks very much,
Jeff Yanosky
Harvard School of Public Health
On Wed, 17 Mar 2004, Matt Loveland wrote:
> I'm having trouble with glmmPQL.
I think you are having trouble with memory limits, actually. As the
author of glmmPQL, I don't appreciate my code being blamed for something
else.
> I'm fitting a 2 level random intercept model, with 90,000 cases and about
> 330 groups. I'm unable to get any results on the full data set. I can get it
> to work if I sample down to about 30,000 cases. But for models with N's much
> larger than that I get the following warning message:
>
> m3 = glmmPQL(prepfood ~ iage + iemployed + iwhite + ieduclevl + imarried +
>     servcomm + leadgrup + leadsty4, family = binomial,
>     random = ~ 1 | congrega1, data = data)
> Error: cannot allocate vector of size 4135 Kb
> In addition: Warning message:
> Reached total allocation of 253Mb: see help(memory.size)
>
> I've tried increasing my virtual memory size, and also defragmenting my
> hard drive. It hasn't helped. I've seen other people asking similar
> questions on the archive, but it seems that this problem should have
> gone away after earlier versions of R, is that right?
Do read the page it asks you to. You are on Windows, and you need to use
the --max-mem-size flag when starting R to increase the memory available
to R. However, if you do, swapping may make your machine nigh unusable.
What did you not understand about help(memory.size)?
This is also in the rw-FAQ: what in that did you not understand?
> Is this a data problem, am I fitting a bad model, or is it a memory size
> problem? I'm hoping it's the last one, and any help is appreciated.
Yes, so try a machine with 2Gb RAM.
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595