[R] Memory issues..

JFRI (Jesper Frickmann) jfri at novozymes.com
Fri Nov 21 16:25:12 CET 2003

I just tried out the 1.8.1 beta build, and it works! It ran through all
17 assays without any problems on Windows 2000.

Thanks to the R development team; they did a great job!

Kind regards, 
Jesper Frickmann 
Statistician, Quality Control 
Novozymes North America Inc. 
Tel. +1 919 494 3266
Fax +1 919 494 3460

-----Original Message-----
From: James MacDonald [mailto:jmacdon at med.umich.edu] 
Sent: Wednesday, November 12, 2003 1:09 PM
To: JFRI (Jesper Frickman); rodrigo.abt at sii.cl; tblackw at umich.edu
Cc: jmacdon at umich.edu
Subject: RE: [R] Memory issues..

There was a discussion about memory allocation on the R-devel list this
summer, and apparently somebody has done something about it in R-1.8.1
(according to BDR's earlier post). If you can compile yourself on
windows, you could check it out yourself.

Original post http://maths.newcastle.edu.au/~rking/R/devel/03b/0432.html
BDR's  reply http://maths.newcastle.edu.au/~rking/R/devel/03b/0433.html

BDR's recent comment
"Hopefully the memory management in R-devel will ease this, 
and you might like to compile that up and try it."



James W. MacDonald
Affymetrix and cDNA Microarray Core
University of Michigan Cancer Center
1500 E. Medical Center Drive
7410 CCGC
Ann Arbor MI 48109

>>> Rodrigo Abt <rodrigo.abt at sii.cl> 11/12/03 12:08PM >>>
I started R with --max-mem-size=300M and it "seems" to work better (at
least it doesn't hang up my machine), but I don't have any results yet.
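For reference, the flag Rodrigo mentions is given on the command line when launching R; it is only understood by the Windows builds (paths are assumed, not taken from the thread):

```shell
# Raise the Windows heap ceiling to 300 MB at startup
# (Windows-only flag, accepted by Rgui.exe and Rterm.exe).
Rgui.exe --max-mem-size=300M
```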

P.S.: Are there any differences in memory management from 1.7.x to 1.8.0?

Rodrigo Abt B.,
Statistical Analyst,
Department of Economic and Tributary Studies,
Studies Subdivision,
SII, Chile.

-----Original Message-----
From: Thomas W Blackwell [mailto:tblackw at umich.edu] 
Sent: Wednesday, November 12, 2003 12:43
To: JFRI (Jesper Frickman)
Cc: rodrigo.abt at sii.cl; jmacdon at umich.edu; r-help at stat.math.ethz.ch 
Subject: RE: [R] Memory issues..

Jesper  -  (off-list)

Jim MacDonald reports seeing different memory-management behavior
between Windows and Linux operating systems on the same, dual boot
machine.  Unfortunately, this is happening at the operating system
level, so the R code cannot do anything about it.  I have cc'ed Jim on
this email, hoping that he will give more details to the entire list.
What operating systems (and versions of R) do you think Rodrigo and
Jesper are using ?

Specifically for Jesper's  AnalyzeAssay() function:  There is some
manipulation you can do using  formula()  or  as.formula()  that will
assign a local object as the environment in which to find values for the
terms in a formula.  (I've never done this, so I can't give you an
example of working code, only references to the help pages for "formula"
and "environment".  It's often very instructive to literally type in the
sequence of statements given as examples at the bottom of each help
page.)  I think this will allow you to avoid assigning to the global
environment.

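Tom's formula/environment idea can be sketched roughly like this (the data frame and variable names below are hypothetical stand-ins; ORDCURV is borrowed from Jesper's description):

```r
# A minimal sketch, assuming made-up data: attach an explicit
# environment to a formula so that model functions look up 'limsdata'
# there, instead of requiring a copy in .GlobalEnv made via <<-.
e <- new.env()
e$limsdata <- data.frame(ORDCURV = gl(2, 5), y = rnorm(10))

f <- as.formula("~ y | ORDCURV", env = e)  # terms resolve in 'e'
environment(f)                             # the formula carries 'e' with it
```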
Are you sure that the call to  rm() below is actually removing the copy
of limsdata that's in .GlobalEnv, rather than a local copy ? I would
expect you to have to specify  pos=1  in order to get the behavior you
want.

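A small sketch of that difference (the one-column data frame is a stand-in for the real dataset):

```r
# Inside a function, a bare rm(limsdata) searches only the local frame;
# to delete the copy that <<- placed in the workspace, name the target
# environment explicitly (pos = 1 and envir = .GlobalEnv are equivalent).
limsdata <- data.frame(x = 1:3)        # stand-in for the real dataset
cleanup <- function() {
  rm("limsdata", envir = .GlobalEnv)   # a bare rm(limsdata) here would only
                                       # warn and leave the global copy alone
  invisible(gc())                      # then let R return the freed memory
}
cleanup()
exists("limsdata")                     # FALSE: the global copy is gone
</imports>
```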
-  tom blackwell  -  u michigan medical school  -  ann arbor  -

On Wed, 12 Nov 2003, JFRI (Jesper Frickman) wrote:

> How much processing takes place before you get to the lme call? Maybe
> R has just used up the memory on something else. I think there is a
> certain amount of memory leak, as I get similar problems with my
> program. I use R 1.8.0. My program goes as follows.
> 1. Use RODBC to get a data.frame containing assays to analyze (17
> assays are found).
> 2. Define an AnalyzeAssay(assay, suffix) function to do the following:
> 	a) Use RODBC to get data.
> 	b) Store dataset "limsdata" in workspace using the <<- operator to
> avoid the following error in qqnorm.lme: Error in eval(expr, envir,
> enclos) : Object "limsdata" not found, when I call it with a
> formula like: ~ resid(.) | ORDCURV.
> 	c) Call lme to analyze data.
> 	d) Produce some diagnostic plots. Record them by setting
> record = TRUE on the trellis.device
> 	e) Save the plots on win.metafile using replayPlot(...)
> 	f) Save text to a file using sink(...)
> 3. Call the function for each assay using the code:
> # Analyze each assay
> for(i in 1:length(assays[,1]))
> {
> 	writeLines(paste("Analyzing ", assays$DILUTION[i], " ", 
> assays$PROFNO[i], "...", sep=""))
> 	flush.console()
> 	AnalyzeAssay(assays$DILUTION[i], assays$PROFNO[i])
> 	# Clean up memory
> 	rm(limsdata)
> 	gc()
> }
> As you can see, I try to remove the dataset stored in workspace and
> call gc() to clean up my memory as I go.
> Nevertheless, when I come to assay 11 out of 17, it stops with a
> memory allocation error. I have to quit R, and start again with assay
> 11; then it stops again with assay 15 and finally 17. The last assays
> have more data than the first ones, but all assays can be completed as
> long as I keep restarting...
> Maybe restarting the job can help you get it done?
> Cheers,
> Jesper
> -----Original Message-----
> From: Rodrigo Abt [mailto:rodrigo.abt at sii.cl]
> Sent: Monday, November 10, 2003 11:02 AM
> To: r-help at stat.math.ethz.ch 
> Subject: [R] Memory issues..
> Hi dear R-listers, I'm trying to fit a 3-level model using lme in R. My
> sample size is about 2965, with 3 factors:
> year (5 levels), ssize (4 levels), condition (2 levels).
> When I issue the following command:
> > lme(..., method = "ML")
> I got the following error:
> Error in logLik.lmeStructInt(lmeSt, lmePars) :
>         Calloc could not allocate (65230 of 8) memory
> In addition: Warning message:
> Reached total allocation of 120Mb: see help(memory.size)
> I'm currently using a Win2000 machine with 128Mb RAM and a 1.2 GHz 
> processor. My version of R is 1.7.1.
> Thanks in advance,
> Rodrigo Abt.
> Department of Economic and Tributary Studies,
> SII, Chile.
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list 
> https://www.stat.math.ethz.ch/mailman/listinfo/r-help
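As a side note, the request that failed in Rodrigo's error is itself tiny; it only fails because the 120 Mb ceiling had already been reached. A quick check of the numbers (65230 elements of 8 bytes each):

```r
# Size of the allocation that failed: about half a megabyte.
65230 * 8 / 1024^2   # roughly 0.50 Mb
```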

More information about the R-help mailing list