[R] Dynamic Memory Allocation Errors in 1.2.0?
Jeff_Lischer@mksinst.com
Thu Dec 28 18:18:19 CET 2000
Let me first say that I'm new to R and new to this mailing list, so I
apologize in advance if this issue has already been discussed here.
The dynamic memory allocation of 1.2.0 seems to be an improvement over the
static allocation of 1.1.1. However, I have run across at least one case
where the dynamic allocation might be corrupting my results. I am
interested in calculating the 95% confidence margins of performance results
based on the number of wins, draws, and losses in a match. For example, for
a match with 10 wins, 20 draws, and 10 losses, the observed performance is
50% and the 95% margins are about (39.0%, 61.0%).
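For reference, the standard normal-theory margins for that example can be
reproduced in a few lines of R (this is just the textbook calculation for the
numbers above, not the ABC method):

# standard-theory 95% margins for 10 wins, 20 draws, 10 losses
x <- c(rep(1.0, 10), rep(0.5, 20), rep(0.0, 10))
m <- mean(x)                                # observed performance: 0.5
se <- sqrt(var(x) / length(x))              # standard error of the mean
100 * (m + qnorm(c(0.025, 0.975)) * se)     # roughly 38.9 and 61.1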
I wrote the following simple script to calculate these margins using the ABC
(approximate bootstrap confidence) resampling method:
library(bootstrap)   # provides abcnon()

# main function to calculate margins based on the ABC method
margins.abc <- function(W, D, L) {
    # Build a vector of game scores: 1 per win, 0.5 per draw, 0 per loss
    w <- rep(1.0, W)
    d <- rep(0.5, D)
    l <- rep(0.0, L)
    # Call abcnon with the (weighted) mean as the statistic; keep the full
    # result in the global workspace so it can be inspected afterwards
    results.abc <<- abcnon(c(w, d, l), function(p, x) {sum(p * x) / sum(p)},
                           alpha = c(0.025, 0.975))
    print("Actual Results")
    print(c(results.abc$stats$t0))
    print("Estimated 95% Margins")
    results.abc$limits
}
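For what it's worth, the 10/20/10 case in the table below comes from a call
like this:

margins.abc(10, 20, 10)   # t0 should be 0.5, with margins near (0.39, 0.61)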
The 'abcnon' routine comes from the 'bootstrap' package; it also reports the
margins based on standard theory, so you can easily see the difference between
the standard and ABC margins. In R 1.1.1 my script always works fine.
However, when I run the same script in 1.2.0 I get incorrect results when the
number of games in the match is large.
  Score           ------------- 95% Margins (%) -------------
  W/D/L           Standard        R 1.1.1         R 1.2.0
  10/20/10        (39.0, 61.0)    (39.0, 61.0)    (39.0, 61.0)
  100/200/100     (46.5, 53.5)    (46.5, 53.5)    (46.5, 53.5)
  400/800/400     (48.3, 51.7)    (48.3, 51.7)    (45.8, 49.2)
  800/1600/800    (48.8, 51.2)    (48.8, 51.2)    (52.8, 55.8)
The 1.1.1 results are always the same as the standard results, which is
correct for these symmetric cases. The 1.2.0 results, however, are
meaningless for the largest two cases as they don't even enclose the
observed result of 50%. My guess is that some critical data is getting
written over by the dynamic memory allocation. I got the same results on
Windows 98 and Windows NT with various settings of the min/max vsize and nsize
memory parameters. I also tried using the 'abc.ci' routine from the 'boot' package
but got similar results for large matches.
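The 'boot' version of the check looks roughly like this (abc.ci expects the
statistic in the weighted form statistic(data, weights)):

library(boot)
# 400 wins, 800 draws, 400 losses -- one of the cases that goes wrong above
x <- c(rep(1.0, 400), rep(0.5, 800), rep(0.0, 400))
abc.ci(x, function(x, w) sum(w * x) / sum(w), conf = 0.95)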
Has anyone else observed this behaviour in 1.2.0? Is there anything I can
do differently to prevent it from happening? Until I figure this out, I
will stay with 1.1.1 -- I don't want to always be worrying about the validity
of the 1.2.0 results.
______________________________
D. Jeffrey Lischer, Ph.D.
Principal Mechanical Engineer
MKS Instruments, Inc.
E-Mail: Jeff_Lischer at mksinst.com