[R] Dynamic Memory Allocation Errors in 1.2.0?
Kurt Hornik
Kurt.Hornik at ci.tuwien.ac.at
Mon Jan 1 14:25:48 CET 2001
>>>>> Jeff Lischer writes:
> Let me first say I'm new to R and new to this mailing list, so I
> apologize in advance if this issue has already been discussed here.
> The dynamic memory allocation of 1.2.0 seems to be an improvement over the
> static allocation of 1.1.1. However, I have run across at least one case
> where the dynamic allocation might be corrupting my results. I am
> interested in calculating the 95% confidence margins of performance results
> based on the number of wins, draws, and losses in a match. For example, for
> a match with 10 wins, 20 draws, and 10 losses, the observed performance is
> 50% and the 95% margins are about (39.0%, 61.0%).
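
For reference, the standard margins quoted above are easy to reproduce
with a few lines of R; a minimal sketch, scoring win = 1, draw = 0.5,
loss = 0 as in the function below:

  # 10 wins, 20 draws, 10 losses
  x  <- c(rep(1, 10), rep(0.5, 20), rep(0, 10))
  m  <- mean(x)                      # observed performance: 0.5
  se <- sd(x) / sqrt(length(x))      # standard error of the mean
  m + c(-1, 1) * qnorm(0.975) * se   # roughly 0.389 0.611
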
> I wrote the following simple file to calculate these margins using the ABC
> method from Resampling:
> # Main function to calculate margins based on the ABC method.
> # Requires the 'bootstrap' package for abcnon().
> library(bootstrap)
> margins.abc <- function(W, D, L) {
>   # Create vectors with the correct number of wins, draws, and losses
>   w <- rep(1.0, W)
>   d <- rep(0.5, D)
>   l <- rep(0.0, L)
>   # Call abcnon() with the (weighted) mean as the statistic;
>   # '<<-' keeps the full result in the global workspace for inspection
>   results.abc <<- abcnon(c(w, d, l), function(p, x) sum(p * x) / sum(p),
>                          alpha = c(0.025, 0.975))
>   print("Actual Results")
>   print(results.abc$stats$t0)
>   print("Estimated 95% Margins")
>   results.abc$limits
> }
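
Assuming the 'bootstrap' package is installed, the function above can
be exercised with, e.g.:

R> margins.abc(10, 20, 10)

which should print the observed result (0.5) and margins near
(0.390, 0.610), matching the first row of the table below.
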
> The 'abcnon' routine is from the 'bootstrap' package; it also
> calculates the margins based on standard theory, so you can easily
> compare the standard and ABC margins. In R 1.1.1, my file always
> seems to work just fine. However, when I run the same file in 1.2.0,
> I get incorrect results when the number of games in the match is
> large.
> ** Score **     ************* 95% Margins (%) *************
> W/D/L           Standard       1.1.1          1.2.0
> 10/20/10        (39.0, 61.0)   (39.0, 61.0)   (39.0, 61.0)
> 100/200/100     (46.5, 53.5)   (46.5, 53.5)   (46.5, 53.5)
> 400/800/400     (48.3, 51.7)   (48.3, 51.7)   (45.8, 49.2)
> 800/1600/800    (48.8, 51.2)   (48.8, 51.2)   (52.8, 55.8)
> The 1.1.1 results are always the same as the standard results, which
> is correct for these symmetric cases. The 1.2.0 results, however, are
> meaningless for the largest two cases as they don't even enclose the
> observed result of 50%. My guess is that some critical data is getting
> written over by the dynamic memory allocation. I got the same results
> on Windows 98 and Windows NT with various settings of the
> --min-vsize/--max-vsize and --min-nsize/--max-nsize parameters. I
> also tried using the 'abc.ci'
> routine from the 'boot' package but got similar results for large
> matches.
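
For completeness, the corresponding 'boot' call would look roughly like
this (a sketch only; abc.ci() expects a statistic taking the data and a
vector of weights):

  library(boot)
  # 400 wins, 800 draws, 400 losses
  x <- c(rep(1, 400), rep(0.5, 800), rep(0, 400))
  abc.ci(x, function(x, w) sum(x * w) / sum(w), conf = 0.95)
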
> Has anyone else observed this behaviour in 1.2.0? Is there anything I
> can do differently to prevent it from happening? Until I figure this
> out, I will stay with 1.1.1 -- I don't want to be constantly
> worrying about the validity of the 1.2.0 results.
On Debian GNU/Linux, I get
R> margins.abc(400, 800, 400)
[1] "Actual Results"
[1] 0.5
[1] "Estimated 95% Margins"
     alpha       abc      stan
[1,] 0.025 0.4826762 0.4826762
[2,] 0.975 0.5173238 0.5173238
R> margins.abc(800, 1600, 800)
[1] "Actual Results"
[1] 0.5
[1] "Estimated 95% Margins"
     alpha       abc      stan
[1,] 0.025 0.4877502 0.4877502
[2,] 0.975 0.5122498 0.5122498
which seems right. Maybe this is only a problem on Windows?
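
If someone who sees the bad numbers on Windows could watch the garbage
collector while the function runs, that might help narrow things down;
a minimal sketch:

R> gcinfo(TRUE)                  # report each garbage collection
R> margins.abc(800, 1600, 800)
R> gcinfo(FALSE)

If the limits vary with when collections happen, that would support the
memory-corruption theory.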
-k