[R] saddle points in optim

Jonathan Phillips 994phij at gmail.com
Sat Nov 6 20:44:08 CET 2010

I've been trying to use optim to minimise least squares for a
function, and then estimate the parameter errors from the Hessian
matrix (calculated with numDeriv::hessian, which another R-help post
suggested is more accurate than the Hessian returned by optim).

To get the standard error estimates, I'm calculating
sqrt(diag(solve(hessian))); I hope that's correct.
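Here is a minimal sketch of what I'm doing, with a made-up
two-parameter exponential-decay model standing in for my real
eight-parameter problem (the data and model here are invented just to
make the example self-contained):

```r
library(numDeriv)

## Fake data from a two-parameter exponential decay (stand-in model)
set.seed(1)
x <- seq(0, 10, length.out = 50)
y <- 2.5 * exp(-0.3 * x) + rnorm(50, sd = 0.05)

## Residual sum of squares to be minimised
rss <- function(p) sum((y - p[1] * exp(-p[2] * x))^2)

## Fit with optim; hessian = TRUE also keeps optim's own Hessian
fit <- optim(c(1, 1), rss, method = "BFGS", hessian = TRUE)

## Hessian at the optimum from numDeriv, and my error estimate
## (this is the step I'm unsure about scaling-wise)
H  <- hessian(rss, fit$par)
se <- sqrt(diag(solve(H)))
```

In my real problem it is the numDeriv H, not optim's, that produces
the NaNs at this last step.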

I've found that using numDeriv's Hessian gives me NaNs for some of
the errors, whereas the one from optim gives numbers for all
parameters.  If I compute the eigenvalues of the numDeriv Hessian, I
get two negative ones (and six positive; I'm fitting eight
parameters).  Does this mean that optim hasn't converged correctly
and has stopped at a saddle point?  If so, is there any way I can
help it find the minimum?
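For what it's worth, this is roughly how I'm checking for a saddle
point, shown on a toy function that genuinely has one at the origin
(the function, start point, and step size are all invented for
illustration):

```r
library(numDeriv)

## Toy objective with a saddle at the origin: curves up along p[1]
## but down along p[2] there, with true minima at p[2] = +/- 1/sqrt(2)
f <- function(p) p[1]^2 - p[2]^2 + p[2]^4

## Suppose the optimiser stalled near the origin (gradient ~ 0 there)
p.hat <- c(0, 1e-4)

## At a true local minimum the Hessian is positive definite,
## so any negative eigenvalue indicates a saddle point
H  <- hessian(f, p.hat)
ev <- eigen(H, symmetric = TRUE, only.values = TRUE)$values

if (any(ev < 0)) {
  ## Step along the direction of most negative curvature and restart
  v     <- eigen(H, symmetric = TRUE)$vectors[, which.min(ev)]
  refit <- optim(p.hat + 0.5 * v, f, method = "BFGS")
}
```

I'd welcome advice on whether perturbing the start point like this is
a sensible way to escape the saddle, or whether a different method
(e.g. several random restarts) is the usual fix.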

Jon Phillips
