[R] Memory issues..
Prof Brian Ripley
ripley at stats.ox.ac.uk
Thu Nov 13 17:12:12 CET 2003
On Thu, 13 Nov 2003, JFRI (Jesper Frickman) wrote:
> I tried first to increase --min-vsize to 2G (which I assume means as
> much of the 512M RAM available on my system as possible). The idea was
> to allocate all the heap memory in one huge chunk to avoid
> fragmentation.
But had you actually read the documentation you would know that it does
not do that: that needs --max-memory-size set.
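For example, to give the whole session a ceiling of 1.5Gb you would start
R as below (a sketch: the Windows binary's README actually documents the
flag as --max-mem-size, so check the spelling against your installation):

  Rgui.exe --max-mem-size=1500M

--min-vsize only sets the initial size of the R heap inside that ceiling;
it does not raise the ceiling itself.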
> It actually brought the number of assays completed up
> from 11 to 13 before it stopped with the usual error. Then I increased
> --max-memory-size to 2G, and when I came in this morning it was still
> running. However, it would probably take days instead of hours to
> complete the last couple of assays! So it is easier to restart a couple
> of times...
>
> Do you think that running R on Linux would fix the problem? I use Linux
> on my private home PC, and I might get permission to try it out on the
> company network... if I have a good reason to do so!
We don't know what the problem is, and you haven't AFAICS compiled up
R-devel and tried that.
> Cheers,
> Jesper
>
> -----Original Message-----
> From: Prof Brian Ripley [mailto:ripley at stats.ox.ac.uk]
> Sent: Wednesday, November 12, 2003 10:55 AM
> To: JFRI (Jesper Frickman)
> Cc: r-help at stat.math.ethz.ch
> Subject: RE: [R] Memory issues..
>
>
> On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:
>
> > How much processing takes place before you get to the lme call? Maybe
> > R has just used up the memory on something else. I think there is a
> > fair amount of memory leakage, as I get similar problems with my program.
>
> > I use
>
> Windows, right? I don't think this is a memory leak, but rather
> fragmentation. Hopefully the memory management in R-devel will ease
> this, and you might like to compile that up and try it.
>
> On R 1.8.0 on Windows you have to be able to find a block of contiguous
> memory of the needed size, so fragmentation can kill you. Try increasing
> --max-memory-size unless you are near 2Gb.
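If you want to see how close you are getting, the Windows build has
memory.size() and memory.limit() (a sketch; these are Windows-only, and
the units reported have varied between bytes and Mb across versions):

  memory.size()            # memory currently in use
  memory.size(max = TRUE)  # the most used so far this session
  memory.limit()           # the ceiling set by --max-memory-size

An allocation failure while memory.size() is still well below the limit is
the signature of fragmentation rather than of running out of memory.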
>
> > R 1.8.0. My program goes as follows.
> >
> > 1. Use RODBC to get a data.frame containing the assays to analyze (17
> >    assays are found).
> >
> > 2. Define an AnalyzeAssay(assay, suffix) function to do the following
> >    (a sketch of this function follows the code below):
> >    a) Use RODBC to get the data.
> >    b) Store the dataset "limsdata" in the workspace using the <<-
> >       operator; otherwise, calling qqnorm.lme with a grouping formula
> >       like ~ resid(.) | ORDCURV fails with:
> >       Error in eval(expr, envir, enclos) : Object "limsdata" not found
> >    c) Call lme to analyze the data.
> >    d) Produce some diagnostic plots, recording them by setting
> >       record=TRUE on the trellis.device.
> >    e) Save the plots to a win.metafile using replayPlot(...).
> >    f) Save text output to a file using sink(...).
> >
> > 3. Call the function for each assay using the code:
> >
> >    # Analyze each assay
> >    for (i in 1:nrow(assays)) {
> >        writeLines(paste("Analyzing ", assays$DILUTION[i], " ",
> >                         assays$PROFNO[i], "...", sep = ""))
> >        flush.console()
> >        AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
> >
> >        # Clean up memory
> >        rm(limsdata)
> >        gc()
> >    }
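A sketch of how AnalyzeAssay() could make that workspace copy clean itself
up, as referenced from step 2 above. The data-fetching helper
GetAssayData() and the model formula are hypothetical stand-ins; the point
is the assign()/on.exit() pairing, which removes the global copy even if
lme() or a plot fails part-way:

  library(nlme)

  AnalyzeAssay <- function(assay, suffix) {
      dat <- GetAssayData(assay, suffix)    # hypothetical RODBC fetch
      # qqnorm.lme must be able to find the data by name, so put one
      # named copy in the workspace -- and schedule its removal for
      # any exit from this function, error or not
      assign("limsdata", dat, envir = .GlobalEnv)
      on.exit(rm(limsdata, envir = .GlobalEnv), add = TRUE)

      fit <- lme(RESPONSE ~ DOSE, data = limsdata,
                 random = ~ 1 | ORDCURV)    # hypothetical model
      print(qqnorm(fit, ~ resid(.) | ORDCURV))

      sink(paste(assay, "_", suffix, ".txt", sep = ""))
      on.exit(sink(), add = TRUE)           # close the sink on exit too
      print(summary(fit))
  }

With this in place the rm(limsdata) in the driver loop becomes unnecessary,
although the gc() after each assay does no harm.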
> >
> > As you can see, I try to remove the dataset stored in the workspace
> > and then call gc() to clean up memory as I go.
> >
> > Nevertheless, when I come to assay 11 out of 17, it stops with a
> > memory allocation error. I have to quit R and start again with assay
> > 11; then it stops again at assay 15 and finally at 17. The later
> > assays have much more data than the first ones, but all assays can be
> > completed as long as I keep restarting...
> >
> > Maybe restarting the job can help you get it done?
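Since every assay completes when R is restarted between them, one blunt
workaround is to give each assay its own R process, and hence a fresh,
unfragmented address space. A sketch of a driver, assuming a hypothetical
per-assay script analyze_one.R that reads its assay index from
assay_index.txt:

  # driver.R -- run each assay in a fresh R process
  for (i in 1:17) {
      writeLines(as.character(i), "assay_index.txt")  # tell the child which assay
      system("Rcmd BATCH --vanilla analyze_one.R")    # new heap every time
  }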
>
>
--
Brian D. Ripley,                  ripley at stats.ox.ac.uk
Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
University of Oxford,             Tel: +44 1865 272861 (self)
1 South Parks Road,                    +44 1865 272866 (PA)
Oxford OX1 3TG, UK                Fax: +44 1865 272595