[R] Bad optimization solution
Ravi Varadhan
rvaradhan at jhmi.edu
Tue May 8 01:10:43 CEST 2007
Your function, (x1-x2)^2, has a zero gradient at every starting value with
x1 = x2, so gradient-based search methods terminate there: they have found a
critical point, i.e. a point at which the gradient is zero (which can be a
maximum, a minimum, or a saddle point).
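To make this concrete, here is a small sketch of mine (not part of the
original exchange), using the same optim call Paul posted below; the second
call uses the perturbed start (0.2, 0.9) that Paul reported works:

myfunc <- function(x) (x[1] - x[2])^2

## gradient is c(2*(x1-x2), -2*(x1-x2)), i.e. exactly c(0, 0) whenever
## x1 == x2, so L-BFGS-B stops at the symmetric starting point:
optim(c(0.5, 0.5), myfunc, lower = c(0, 0), upper = c(1, 1),
      method = "L-BFGS-B", control = list(fnscale = -1))

## a non-symmetric start reaches a boundary maximum, (1,0) or (0,1):
optim(c(0.2, 0.9), myfunc, lower = c(0, 0), upper = c(1, 1),
      method = "L-BFGS-B", control = list(fnscale = -1))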
However, I do not know why optim converges to the boundary maximum when the
analytic gradient is supplied (as shown by Sundar).
Ravi.
-------------------------------------------------------------------------------
Ravi Varadhan, Ph.D.
Assistant Professor, The Center on Aging and Health
Division of Geriatric Medicine and Gerontology
Johns Hopkins University
Ph: (410) 502-2619
Fax: (410) 614-9625
Email: rvaradhan at jhmi.edu
Webpage: http://www.jhsph.edu/agingandhealth/People/Faculty/Varadhan.html
-------------------------------------------------------------------------------
-----Original Message-----
From: r-help-bounces at stat.math.ethz.ch
[mailto:r-help-bounces at stat.math.ethz.ch] On Behalf Of Paul Smith
Sent: Monday, May 07, 2007 6:26 PM
To: R-help
Subject: Re: [R] Bad optimization solution
On 5/7/07, Paul Smith <phhs80 at gmail.com> wrote:
> > I think the problem is the starting point. I do not remember the details
> > of the BFGS method, but I am almost sure the (.5, .5) starting point is
> > suspect, since the abs function is not differentiable at 0. If you perturb
> > the starting point even slightly you will have no problem.
> >
> > "Paul Smith"
> > <phhs80 at gmail.com
> > >
To
> > Sent by: R-help <r-help at stat.math.ethz.ch>
> > r-help-bounces at st
cc
> > at.math.ethz.ch
> >
Subject
> > [R] Bad optimization solution
> > 05/07/2007 04:30
> > PM
> >
> >
> >
> >
> >
> >
> >
> >
> > Dear All
> >
> > I am trying to perform the below optimization problem, but getting
> > (0.5,0.5) as optimal solution, which is wrong; the correct solution
> > should be (1,0) or (0,1).
> >
> > Am I doing something wrong? I am using R 2.5.0 on Fedora Core 6 (Linux).
> >
> > Thanks in advance,
> >
> > Paul
> >
> > ------------------------------------------------------
> > myfunc <- function(x) {
> >   x1 <- x[1]
> >   x2 <- x[2]
> >   abs(x1-x2)
> > }
> >
> > optim(c(0.5,0.5), myfunc, lower=c(0,0), upper=c(1,1),
> >       method="L-BFGS-B", control=list(fnscale=-1))
>
> Yes, with (0.2,0.9), a correct solution comes out. However, how can
> one be sure in general that the solution obtained by optim is correct?
> The help page ?optim says:
>
> Method '"L-BFGS-B"' is that of Byrd _et. al._ (1995) which allows
> _box constraints_, that is each variable can be given a lower
> and/or upper bound. The initial value must satisfy the
> constraints. This uses a limited-memory modification of the BFGS
> quasi-Newton method. If non-trivial bounds are supplied, this
> method will be selected, with a warning.
>
> which only demands that "the initial value must satisfy the constraints".
Furthermore, (x1-x2)^2 is everywhere differentiable, and yet the reported
problem still occurs with
myfunc <- function(x) {
  x1 <- x[1]
  x2 <- x[2]
  (x1-x2)^2
}

optim(c(0.2,0.2), myfunc, lower=c(0,0), upper=c(1,1),
      method="L-BFGS-B", control=list(fnscale=-1))
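(An aside, not part of the original message: one hedged way to address the
"how can one be sure" question above is to rerun optim from several random
starting points and keep the best result, e.g.

myfunc <- function(x) (x[1] - x[2])^2
starts <- matrix(runif(20), ncol = 2)   # 10 random starts inside [0,1]^2
fits <- apply(starts, 1, function(s)
  optim(s, myfunc, lower = c(0, 0), upper = c(1, 1),
        method = "L-BFGS-B", control = list(fnscale = -1)))
best <- fits[[which.max(sapply(fits, function(f) f$value))]]
best$par   # typically close to (1,0) or (0,1)

This does not guarantee a global optimum, but it makes a stall at an interior
critical point much easier to spot.)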
Paul
______________________________________________
R-help at stat.math.ethz.ch mailing list
https://stat.ethz.ch/mailman/listinfo/r-help
PLEASE do read the posting guide http://www.R-project.org/posting-guide.html
and provide commented, minimal, self-contained, reproducible code.