[R] Bad optimization solution
S Ellison
S.Ellison at lgc.co.uk
Tue May 8 17:19:28 CEST 2007
Paul,
You have picked a function that is not smoothly differentiable, and you have also started at one of many 'stationary' points in a problem with multiple solutions. Because optim computes the gradient numerically and your function is symmetric about x1 = x2, the finite-difference gradient at (0.5, 0.5) comes out exactly zero. The algorithm then takes gradient-related steps of size zero, goes nowhere, and converges instantly. The same happens at (0.1, 0.1) and anywhere else along the line x1 = x2.
The problem affects pretty much any gradient-only algorithm handed a stationary point of a symmetric function.
Solution? Ermm.. "don't do that with a gradient method", I suspect, though wiser heads may have more to say on the topic.
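To see the effect, compare the same call from the symmetric start with one from a start off the line x1 = x2 (a sketch; the starting point c(0.2, 0.8) is just an arbitrary asymmetric choice):

```r
f <- function(x) abs(x[1] - x[2])

## From the symmetric start, the finite-difference gradient is zero
## (f looks the same on both sides of x1 = x2), so L-BFGS-B stops at once:
res1 <- optim(c(0.5, 0.5), f, lower = c(0, 0), upper = c(1, 1),
              method = "L-BFGS-B", control = list(fnscale = -1))
res1$par  # should stay at 0.5 0.5

## Off the symmetry line, f is locally smooth (f = x2 - x1 when x2 > x1),
## the gradient is nonzero, and the same call climbs to a corner:
res2 <- optim(c(0.2, 0.8), f, lower = c(0, 0), upper = c(1, 1),
              method = "L-BFGS-B", control = list(fnscale = -1))
res2$par  # should reach the corner (0, 1)
```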
S
>>> "Paul Smith" <phhs80 at gmail.com> 07/05/2007 22:30:32 >>>
Dear All
I am trying to perform the optimization below, but I am getting
(0.5, 0.5) as the optimal solution, which is wrong; the correct solution
should be (1, 0) or (0, 1).
Am I doing something wrong? I am using R 2.5.0 on Fedora Core 6 (Linux).
Thanks in advance,
Paul
------------------------------------------------------
myfunc <- function(x) {
    x1 <- x[1]
    x2 <- x[2]
    abs(x1 - x2)
}

optim(c(0.5, 0.5), myfunc, lower = c(0, 0), upper = c(1, 1),
      method = "L-BFGS-B", control = list(fnscale = -1))
______________________________________________
R-help at stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.