R-beta: SEs for one-param MLE in R?

Martin Maechler <maechler@stat.math.ethz.ch>
Tue, 21 Apr 1998 11:51:28 +0200

	[--- diverting from  R-help to R-devel ... MM ---]

>>>>> "Jim" == Jim Lindsey <jlindsey@luc.ac.be> writes:

    Jim> Just back from a week camping in the snow in the English Lakes,
    Jim> and trying to catch up...

I hope the adventurous vacation was also relaxing in some ways...

    MM> ....
    MM> BTW: I am (we are) interested in the functions that you are writing
    MM> for nlm(.)  It certainly is worthwhile to have nlm(.) return a class
    MM> "nlm" result and provide print.nlm(.) and summary.nlm(.) functions
    MM> {{ Jim Lindsey already posted something like this, unfortunately
    MM> using "nls" which we don't want as long as it is not very close to
    MM> S' nls(.) function }}

    Jim> I am afraid that I don't understand the logic of this requirement
    Jim> of closeness to R for such functions. nlm() itself is not close
    Jim> and even hist() has never been very similar.

Yes, there are some cases where we (the R-core team, or the "R-devel" group)
have decided that S-PLUS is so wrong that we don't want to emulate or stay
close to it.  hist(.) is one such example.

With nls(.), this is quite different, I think.
nls(.) has several nice features (your "nls" did too!).
Most notably, the calling syntax of nls(.), using model-formula notation,
is something I would want before I'd call a function "nls".

At the moment, I think it would be better to add an "nlm" class and
corresponding print and summary methods to  nlm(.), rather than calling
such a thing "nls".
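A minimal sketch of that idea, purely for illustration (none of these names existed in R at the time; `mle1` and the method body are made up here):

```r
## Hypothetical sketch only: a wrapper that tags nlm()'s result with a
## class "nlm", plus a simple print method for it.
mle1 <- function(f, p, ...) {
  res <- nlm(f, p, hessian = TRUE, ...)
  class(res) <- "nlm"
  res
}

print.nlm <- function(x, digits = 4, ...) {
  cat("Minimum of objective:", format(x$minimum, digits = digits), "\n")
  cat("Estimate(s):        ", format(x$estimate, digits = digits), "\n")
  if (!is.null(x$hessian))   # SEs from the inverse observed information
    cat("Approx. SE(s):      ",
        format(sqrt(diag(solve(x$hessian))), digits = digits), "\n")
  invisible(x)
}

fit <- mle1(function(p) (p - 3)^2, p = 0)
fit   # auto-printing now dispatches to print.nlm
```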

Also, a few weeks ago, Ross said that he planned to add code to nlm
which would make use of gradient and Hessian functions when provided
(or perhaps derived automatically via D(.)?).
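Tying this back to the thread's subject line: once nlm(.) gives you the Hessian (via `hessian = TRUE`), a one-parameter SE falls out as the square root of the inverse observed information. A made-up example (the data and exponential model here are illustrative only, not from the thread):

```r
## Illustrative only: SE for a one-parameter MLE from nlm()'s Hessian.
set.seed(1)
x <- rexp(100, rate = 2)                 # simulated data, true rate = 2

## negative log-likelihood, parameterized on log(rate) so the
## optimizer never wanders into invalid (negative) rates
negll <- function(lrate) -sum(dexp(x, rate = exp(lrate), log = TRUE))

fit <- nlm(negll, p = 0, hessian = TRUE)

rate.hat <- exp(fit$estimate)
se.lrate <- sqrt(1 / fit$hessian[1, 1])  # SE on the log scale
se.rate  <- rate.hat * se.lrate          # delta method back to the rate
c(rate = rate.hat, se = se.rate)
</p>
```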

    Jim> On the other hand,
    Jim> remember that the nls() I sent was a very cut-down version of one
    Jim> in one of my libraries. It had the above solution for the
    Jim> inversion with one dimensional parameters. By the way, the
    Jim> original Fortran for nlm that I ported to R printed out a warning
    Jim> that the algorithm is very inefficient for one-dimensional
    Jim> problems.
Yes, your "nls" function was useful,
and I agree that "nlm" should be pushed in the direction of what you had.

    Jim>   With respect to Bill's negative binomial that started another
    Jim> discussion, one of the functions in my nonlinear regression
    Jim> library does negative binomial nonlinear regression for both the
    Jim> mean and the dispersion parameters, along with twenty odd other
    Jim> distributions. I used it for beta-binomial regression in my paper
    Jim> with Pat Altham in the latest Applied Statistics. (Also another
    Jim> similar function for a finite mixture with these distributions,
    Jim> for example for negative binomial with inflated zeroes.) In my
    Jim> repeated measures library, there is a similar function for
    Jim> regressions with the same collection of distributions, but having
    Jim> a random intercept. Once R stabilizes...

Well, isn't R stabilizing more and more?
Since 0.61, at least the way extension packages (formerly known as "libraries")
should be written has been pretty stable.
Jim, I think several people really would be interested in your 
``nonlinear regression library'' and the ``repeated measures library''.

If they don't work in R 0.61, they could at least be put into 

Best regards!
r-devel mailing list -- Read http://www.ci.tuwien.ac.at/~hornik/R/R-FAQ.html
Send "info", "help", or "[un]subscribe"
(in the "body", not the subject !)  To: r-devel-request@stat.math.ethz.ch