growing process size in simulation
Peter Dalgaard BSA
p.dalgaard@biostat.ku.dk
11 Oct 2002 21:31:21 +0200
Achim Zeileis <zeileis@ci.tuwien.ac.at> writes:
> I came across this in a simulation I ran under 1.6.0: If I do something
> like
>
> R> x <- rnorm(10)
> R> rval <- NULL
> R> for(i in 1:100000) rval <- t.test(x)$p.value
>
> then the process size stays at about 14M under 1.5.1, but under 1.6.0 it
> grows almost linearly to more than 100M.
>
> I know that the above simulation is nonsense, but it was the simplest case I
> could come up with that reproduces the behaviour. It doesn't depend on
> t.test; if I use wilcox.test(x)$p.value, the same thing happens...
Argh. Confirmed. One interesting clue is that R itself doesn't seem to
know about this:
for (i in 1:50000) {
    rval <- t.test(x)$p.value
    if (i %% 10000 == 0) print(gc())
}
          used (Mb) gc trigger (Mb)
Ncells  208343  5.6     407500 10.9
Vcells   64656  0.5     786432  6.0
(identical output at each of the five checkpoints, i = 10000, ..., 50000)
...but the memory footprint reported by the OS is still growing, so whatever
is accumulating apparently lives outside the Ncells/Vcells heap that gc()
tracks.
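To watch both views side by side, here is a minimal sketch (an addition, not
part of the original report; it assumes a Unix-like system where
"ps -o rss=" prints the process's resident set size in kB):

x <- rnorm(10)
for (i in 1:50000) {
    rval <- t.test(x)$p.value
    if (i %% 10000 == 0) {
        print(gc())  # R's view: Ncells/Vcells usage
        ## OS's view: resident set size (kB) of the current R process
        rss <- system(paste("ps -o rss= -p", Sys.getpid()), intern = TRUE)
        cat("RSS (kB):", rss, "\n")
    }
}

On 1.5.1 both numbers should stay put; on 1.6.0 the RSS line should climb
steadily while gc() reports the same usage at every checkpoint.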
--
   O__  ---- Peter Dalgaard             Blegdamsvej 3
  c/ /'_ --- Dept. of Biostatistics     2200 Cph. N
 (*) \(*) -- University of Copenhagen   Denmark      Ph:  (+45) 35327918
~~~~~~~~~~ - (p.dalgaard@biostat.ku.dk)              FAX: (+45) 35327907