[R] saddle points in optim
Ben Bolker
bbolker at gmail.com
Sun Nov 7 16:21:11 CET 2010
Jonathan Phillips <994phij <at> gmail.com> writes:
> I've been trying to use optim to minimise least squares for a
> function, and then get a guess at the error using the Hessian matrix
> (calculated with numDeriv::hessian, which I read in some other R-help
> post was meant to be more accurate than the Hessian given by optim).
>
> To get the standard error estimates, I'm calculating
> sqrt(diag(solve(x))), hope that's correct.
>
> I've found that using numDeriv's Hessian gets me some NaNs for errors,
> whereas the one from optim gets me numbers for all parameters. If I
> compute the eigenvalues of the numDeriv::hessian result, I get two
> negative ones (and six positive - I'm fitting eight parameters), so
> does this mean that optim hasn't converged correctly, and has hit a
> saddle point? If so, is there any way I could assist it to find the minimum?
It's hard to say very much more without a reproducible example.
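As a stand-in, here is a toy version of the calculation as I
understand it (the data and the objective 'fn' are invented, just to
make the code below concrete):

set.seed(101)
x <- runif(50)
y <- 2 + 3*x + rnorm(50, sd = 0.1)
fn <- function(p) sum((y - (p[1] + p[2]*x))^2)  ## least-squares objective
fit <- optim(c(1, 1), fn, method = "BFGS", hessian = TRUE)
H <- numDeriv::hessian(fn, fit$par)
## proposed standard errors: NaN wherever diag(solve(H)) is negative
## (this is exact for a negative log-likelihood; a raw sum of squares
## needs an extra scaling by the residual variance)
sqrt(diag(solve(H)))
eigen(H)$values  ## all positive at a proper local minimum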
If you can invert the Hessian, you can see whether the trouble
arises from particular parameters or from particular parameter
combinations: the eigenvectors associated with the negative
eigenvalues tell you which directions are suspect.
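For example, continuing the toy sketch above:

ee <- eigen(H)
bad <- which(ee$values < 0)
ee$values[bad]                             ## the negative eigenvalues, if any
round(ee$vectors[, bad, drop = FALSE], 3)  ## loadings of each parameter on them
which(diag(solve(H)) < 0)                  ## parameters whose 'SE' comes out NaN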
For each 'difficult' parameter, you can vary its value while holding
the others fixed at the point that optim() has found, and see whether
the curvature there is indeed negative in that direction.
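Something like this would show one of those slices (again continuing
the sketch; 'i' is the index of a suspect parameter):

i <- 2                       ## pick a 'difficult' parameter
pp <- fit$par
pvec <- pp[i] + seq(-0.5, 0.5, length.out = 41)
slice <- sapply(pvec, function(v) { p <- pp; p[i] <- v; fn(p) })
plot(pvec, slice, type = "l",
     xlab = paste("parameter", i), ylab = "objective")
abline(v = pp[i], lty = 2)   ## a local maximum here indicates a saddle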
You could try restarting from the current values; trying other
optimization methods (take a look at the 'optimx' package on
R-forge); using a stochastic optimization method such as SANN; or
somehow coming up with better starting parameters.
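For example (untested sketches; 'optimx' would have to be installed
from R-forge first):

## restart from where the previous fit stopped
fit2 <- optim(fit$par, fn, method = "BFGS", hessian = TRUE)
## or a stochastic search, then a gradient-based polish
fit3 <- optim(fit$par, fn, method = "SANN")
fit4 <- optim(fit3$par, fn, method = "BFGS", hessian = TRUE)
## or compare several optimizers in one call
library(optimx)
optimx(fit$par, fn, method = c("Nelder-Mead", "BFGS", "nlminb"))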