[R] Global speed ...
j.logsdon at lancaster.ac.uk
Fri Jun 4 07:42:28 CEST 1999
A good word, if you please.
I have coded an R routine to decompose messy data, following Murray Aitkin
and Brian Francis's NPMLE GLIM macros (normal distribution only) but
extended to incorporate censoring and variance heterogeneity. Essentially
it is a wrapper around nlm() for the M-step, while the E-step re-estimates
the weights in the same way as the GLIM macros.
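For concreteness, the skeleton is roughly as follows. This is a
stripped-down illustration rather than my actual routine (em.npml, the
start values and the convergence test are just placeholders):

em.npml <- function(y, X, K, maxit = 500, tol = 1e-6) {
    n <- length(y)
    p <- ncol(X)
    w <- matrix(1/K, n, K)                  # E-step weights
    par <- c(rep(0, p), quantile(y, ppoints(K)), log(sd(y)))
    negllik <- function(par, w) {           # M-step objective for nlm()
        beta  <- par[1:p]
        theta <- par[p + (1:K)]             # mass-point locations
        sigma <- exp(par[p + K + 1])
        mu <- outer(drop(X %*% beta), theta, "+")
        -sum(w * dnorm(y, mu, sigma, log = TRUE))
    }
    for (it in 1:maxit) {
        fit <- nlm(negllik, par, w = w)     # M-step
        par   <- fit$estimate
        beta  <- par[1:p]
        theta <- par[p + (1:K)]
        sigma <- exp(par[p + K + 1])
        mu <- outer(drop(X %*% beta), theta, "+")
        f  <- matrix(dnorm(y, mu, sigma), n, K)
        pk <- colMeans(w)                   # mass-point probabilities
        wnew <- sweep(f, 2, pk, "*")
        wnew <- wnew / rowSums(wnew)        # E-step: posterior weights
        if (max(abs(wnew - w)) < tol) break
        w <- wnew
    }
    list(coef = beta, mass = theta, sigma = sigma, weights = w)
}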
The big problem is that it is slow to run: even with homogeneous variance
and no censoring it is substantially slower than the GLIM equivalent, and
when I incorporate censoring, variance heterogeneity, a few factors in the
random part and so on, the number of coefficients being estimated climbs
very quickly and the fit becomes an overnight job. No SGI Origin to hand,
I am afraid, just good old Linux on an AMD 350 box.
Being a simple chap, and not wanting to get my hands dirty with C, I have
coded it using global variables (OK, back of the class) so that I can
restart it if it hasn't converged within 500 iterations and can inspect
all the variables. Obviously the important part is the function call,
which I will speed up, but other than that: do global variables slow
things down?
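For what it is worth, the restart trick could presumably be had without
globals by passing the data through nlm()'s "..." argument and handing
the state back for a warm restart. A rough sketch, with run.chunk a
made-up name and negllik a free-standing version of the objective above
taking (par, y, X, w); nlm() returns code 4 when the iteration limit
is hit:

run.chunk <- function(state, y, X, w, iter = 500) {
    fit <- nlm(negllik, state$par, y = y, X = X, w = w, iterlim = iter)
    state$par  <- fit$estimate          # carry the parameters forward
    state$code <- fit$code              # 4 == iteration limit reached
    state
}
## st <- list(par = start.values)
## st <- run.chunk(st, y, X, w)        # repeat until st$code != 4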