[Rd] Seeking opinions on possible change to nls() code
J C Nash
profjcnash at gmail.com
Fri Aug 20 17:41:26 CEST 2021
Thanks Martin. I'd missed the intention of that option, but re-reading
it now, it is obvious.
FWIW, this problem is quite nasty, and so far I've found no method
that reveals the underlying dangers well. And one of the issues with
nonlinear models is that they reveal how slippery the concept of
inference can be when applied to parameters in such models.
JN
On 2021-08-20 11:35 a.m., Martin Maechler wrote:
>>>>>> J C Nash
>>>>>> on Fri, 20 Aug 2021 11:06:25 -0400 writes:
>
> > In our work on a Google Summer of Code project
> > "Improvements to nls()", the code has proved sufficiently
> > entangled that we have found (so far!) few
> > straightforward changes that would not break legacy
> > behaviour. One issue that might be fixable is that nls()
> > returns no result if it encounters some computational
> > blockage AFTER it has already found a much better "fit",
> > i.e. a set of parameters with a smaller sum of squares.
> > Here is a version of the Tetra example:
>
> time <- c(1, 2, 3, 4, 6, 8, 10, 12, 16)
> conc <- c(0.7, 1.2, 1.4, 1.4, 1.1, 0.8, 0.6, 0.5, 0.3)
> NLSdata <- data.frame(time, conc)
> NLSstart <- c(lrc1 = -2, lrc2 = 0.25, A1 = 150, A2 = 50) # a starting vector (named!)
> NLSformula <- conc ~ A1*exp(-exp(lrc1)*time) + A2*exp(-exp(lrc2)*time)
> tryit <- try(nls(NLSformula, data = NLSdata, start = NLSstart, trace = TRUE))
> print(tryit)
>
> > If you run this, tryit does not give information that the
> > sum of squares has been reduced from > 60000 to < 2, as
> > the trace shows.
>
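> > For concreteness, a minimal sketch (assuming the example above
> > has just been run) of what the caller is actually left with:
> > only the error text survives.
>
> if (inherits(tryit, "try-error")) {
>   ## no "nls" object comes back, so coef(), fitted(), deviance()
>   ## etc. are unavailable; only the error condition is recoverable
>   cat(conditionMessage(attr(tryit, "condition")))
> }
>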
> > Should we propose that this be changed so the returned
> > object gives the best fit so far, albeit with some form of
> > message or return code to indicate that this is not
> > necessarily a conventional solution? Our concern is that
> > some examples might need to be adjusted slightly, or we
> > might simply add the "try-error" class to the output
> > information in such cases.
>
> > Comments are welcome, as this is as much an infrastructure
> > matter as a computational one.
>
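> > To make the second alternative concrete (purely a sketch, not a
> > settled design), the returned object could keep the parameters
> > found so far and simply be tagged, e.g. via a made-up helper:
>
> markPartial <- function(fit) {
>   ## hypothetical, not current nls() behaviour: keep the fit but
>   ## flag it so callers can detect the early stop
>   class(fit) <- c("try-error", class(fit))
>   attr(fit, "partialFit") <- TRUE
>   fit
> }
>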
> Hmm... many years ago, we introduced the 'warnOnly = TRUE'
> option to nls(), i.e. nls.control(), exactly for such cases,
> where people would still like to see the solution:
>
> So,
>
> ------------------------------------------------------------------------------
>> try2 <- nls(NLSformula, data=NLSdata, start=NLSstart, trace=TRUE,
> control = nls.control(warnOnly=TRUE))
> 61215.76 (3.56e+03): par = (-2 0.25 150 50)
> 2.175672 (2.23e+01): par = (-1.9991 0.3171134 2.618224 -1.366768)
> 1.621050 (7.14e+00): par = (-1.960475 -2.620293 2.575261 -0.5559918)
> Warning message:
> In nls(NLSformula, data = NLSdata, start = NLSstart, trace = TRUE, :
> singular gradient
>
>> try2
> Nonlinear regression model
> model: conc ~ A1 * exp(-exp(lrc1) * time) + A2 * exp(-exp(lrc2) * time)
> data: NLSdata
> lrc1 lrc2 A1 A2
> -22.89 96.43 156.70 -156.68
> residual sum-of-squares: 218483
>
> Number of iterations till stop: 2
> Achieved convergence tolerance: 7.138
> Reason stopped: singular gradient
>
>> coef(try2)
> lrc1 lrc2 A1 A2
> -22.88540 96.42686 156.69547 -156.68461
>
>
>> summary(try2)
> Error in chol2inv(object$m$Rmat()) :
> element (3, 3) is zero, so the inverse cannot be computed
>>
> ------------------------------------------------------------------------------
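>
> The rank deficiency behind that error can be seen directly; a
> small sketch, using the try2 object from above:
>
> R <- try2$m$Rmat()          # R factor from the QR of the gradient
> qr(R)$rank                  # less than length(coef(try2)) here,
>                             #  which is why chol2inv() fails
> qr(try2$m$gradient())$rank  # same deficiency in the Jacobian itself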
>
> and similar for vcov(), of course, where the above error
> originates.
>
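> For callers who want the partial fit routinely, the warnOnly
> approach can be wrapped up; a rough sketch (the helper name
> nlsPartial is made up for illustration):
>
> nlsPartial <- function(formula, data, start, ...) {
>   ## run nls() with warnOnly = TRUE so a fit object is always
>   ## returned, then report via convInfo why iteration stopped
>   fit <- nls(formula, data = data, start = start,
>              control = nls.control(warnOnly = TRUE), ...)
>   ci <- fit$convInfo
>   if (!isTRUE(ci$isConv))
>     message("nls() stopped early: ", ci$stopMessage,
>             "; parameters are the last evaluated, not necessarily",
>             " the best seen in the trace")
>   fit
> }
>
> fit2 <- nlsPartial(NLSformula, NLSdata, NLSstart)
> coef(fit2)      # as above, these can be far from the best trace values
> deviance(fit2)  # corresponding residual sum of squares
>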
> { I think GSoC (and other) students should start by studying and
> exploring relevant help pages before drawing conclusions
> ......
> but yes, I was born in the last millennium ...
> }
>
> ;-)
>
> Have a nice weekend!
> Martin
>