[R] Windows Memory Issues
Pikounis, Bill
v_bill_pikounis at merck.com
Tue Dec 9 18:36:17 CET 2003
> [snipped] Or maybe some good links about memory and garbage
> collection.
As is mentioned from time to time on this list when this subject comes up,
Windows memory is a complicated topic. One open-source utility I have found
helpful for monitoring memory when I work under XP is RAMpage, written by
John Fitzgibbon and available at
http://www.jfitz.com/software/RAMpage/
Its FAQ / Help pages touch on a lot of general memory and resource issues
that I found helpful to learn about:
http://www.jfitz.com/software/RAMpage/RAMpage_FAQS.html
(Though the author clearly warns that its usefulness for "freeing memory"
may be no more than cosmetic on NT / 2000 / XP systems.)
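
On the R side, a couple of built-in functions can complement an external
monitor like RAMpage. The following is only a minimal sketch (memory.size()
and memory.limit() exist only in R for Windows, and the exact numbers will
of course differ from machine to machine):

    memory.limit()            # address-space limit for this R session, in Mb
    memory.size()             # Mb of memory currently in use by R
    memory.size(max = TRUE)   # high-water mark since the session started

    x <- matrix(rnorm(1e7), nrow = 1000)   # roughly 76 Mb of doubles
    memory.size()                          # usage jumps accordingly

    rm(x)
    gc()           # first collection releases most of the vector memory
    gc()           # a second call may report a still-smaller "used" figure
    memory.size()

Comparing memory.size() against what Task Manager (or RAMpage) reports for
the Rgui process can help separate R's own heap usage from memory the OS
simply has not reclaimed yet.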
Hope that helps.
Bill
----------------------------------------
Bill Pikounis, Ph.D.
Biometrics Research Department
Merck Research Laboratories
PO Box 2000, MailDrop RY33-300
126 E. Lincoln Avenue
Rahway, New Jersey 07065-0900
USA
Phone: 732 594 3913
Fax: 732 594 1565
> -----Original Message-----
> From: r-help-bounces at stat.math.ethz.ch
> [mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of
> Benjamin.STABLER at odot.state.or.us
> Sent: Tuesday, December 09, 2003 12:09 PM
> To: r-help at stat.math.ethz.ch
> Subject: Re: [R] Windows Memory Issues
>
>
> I would also like some clarification about R memory management. Like
> Doug, I didn't find anything documenting that consecutive calls to
> gc() free more memory. We run into memory limit problems every now
> and then, and a better understanding of R's memory management would
> go a long way. I am interested in learning more and was wondering if
> there is any specific R documentation that explains R's memory usage.
> Or maybe some good links about memory and garbage collection. Thanks.
>
> Benjamin Stabler
> Transportation Planning Analysis Unit
> Oregon Department of Transportation
> 555 13th Street NE, Suite 2
> Salem, OR 97301 Ph: 503-986-4104
>
> -------------------------------------------
>
> Message: 21
> Date: Mon, 8 Dec 2003 09:51:12 -0800 (PST)
> From: Douglas Grove <dgrove at fhcrc.org>
> Subject: Re: [R] Windows Memory Issues
> To: Prof Brian Ripley <ripley at stats.ox.ac.uk>
> Cc: r-help at stat.math.ethz.ch
>
> On Sat, 6 Dec 2003, Prof Brian Ripley wrote:
>
> > I think you misunderstand how R uses memory. gc() does not free up
> > all the memory used for the objects it frees, and repeated calls
> > will free more. Don't speculate about how memory management works:
> > do your homework!
>
> Are you saying that consecutive calls to gc() will free more memory
> than a single call, or am I misunderstanding? Reading ?gc and ?Memory,
> I don't see anything about this mentioned. Where should I be looking
> to find more comprehensive info on R's memory management? I'm not
> writing any packages; I would just like to have a better handle on
> using memory efficiently, as it is usually the limiting factor with
> R. FYI, I'm running R 1.8.1 and Red Hat 9 on a P4 with 2GB of RAM, in
> case there is any platform-specific info that may be applicable.
>
> Thanks,
>
> Doug Grove
> Statistical Research Associate
> Fred Hutchinson Cancer Research Center
>
>
> > In any case, you are using an outdated version of R, and your first
> > course of action should be to compile up R-devel and try that, as
> > there have been improvements to memory management under Windows.
> > You could also try compiling using the native malloc (and that *is*
> > described in the INSTALL file), as that has different compromises.
> >
> >
> > On Sat, 6 Dec 2003, Richard Pugh wrote:
> >
> > > Hi all,
> > >
> > > I am currently building an application based on R 1.7.1 (+ compiled
> > > C/C++ code + MySQL + VB). I am building this application to work on
> > > two different platforms (Windows XP Professional (500MB memory) and
> > > Windows NT 4.0 with Service Pack 6 (1GB memory)). This is a very
> > > memory-intensive application performing sophisticated operations on
> > > "large" matrices (typically 5000x1500).
> > >
> > > I have run into some issues regarding the way R handles its memory,
> > > especially on NT. In particular, R does not seem able to reclaim
> > > some of the memory used following the creation and manipulation of
> > > large data objects. For example, I have a function which receives a
> > > (large) numeric matrix, matches it against more data (maybe imported
> > > from MySQL) and returns a large list structure for further analysis.
> > > A typical call may look like this:
> > >
> > > > myInputData <- matrix(sample(1:100, 7500000, T), nrow=5000)
> > > > myPortfolio <- createPortfolio(myInputData)
> > >
> > > It seems I can only repeat this process 2 or 3 times before I have
> > > to restart R (to get the memory back). I use the same object names
> > > (myInputData and myPortfolio) each time, so I am not creating more
> > > large objects.
> > >
> > > I think the problems I have are illustrated by the following
> > > example from a small R session:
> > >
> > > > # Memory usage for Rgui process = 19,800k
> > > > testData <- matrix(rnorm(10000000), 1000) # Create big matrix
> > > > # Memory usage for Rgui process = 254,550k
> > > > rm(testData)
> > > > # Memory usage for Rgui process = 254,550k
> > > > gc()
> > >          used (Mb) gc trigger  (Mb)
> > > Ncells 369277  9.9     667722  17.9
> > > Vcells  87650  0.7   24286664 185.3
> > > > # Memory usage for Rgui process = 20,200k
> > >
> > > In the above code, R cannot reclaim all the memory used, so the
> > > process's memory usage increases from 19,800k to 20,200k. However,
> > > the following example is more typical of the environments I use:
> > >
> > > > # Memory 128,100k
> > > > myTestData <- matrix(rnorm(10000000), 1000)
> > > > # Memory 357,272k
> > > > rm(myTestData)
> > > > # Memory 357,272k
> > > > gc()
> > >           used (Mb) gc trigger  (Mb)
> > > Ncells  478197 12.8     818163  21.9
> > > Vcells 9309525 71.1   31670210 241.7
> > > > # Memory 279,152k
> > >
> > > Here, the memory usage increases from 128,100k to 279,152k.
> > >
> > > Could anyone point out what I could do to rectify this (if
> > > anything), or generally what strategy I could take to improve this?
> > >
> > > Many thanks,
> > > Rich.
> > >
> > > Mango Solutions
> > > Tel : (01628) 418134
> > > Mob : (07967) 808091
> > >
> >
> > --
> > Brian D. Ripley,                  ripley at stats.ox.ac.uk
> > Professor of Applied Statistics,  http://www.stats.ox.ac.uk/~ripley/
> > University of Oxford,             Tel:  +44 1865 272861 (self)
> > 1 South Parks Road,                     +44 1865 272866 (PA)
> > Oxford OX1 3TG, UK                Fax:  +44 1865 272595