[R] DUD (Does not Use Derivatives) for nonlinear regression in R?
Prof J C Nash (U30A)
nashjc at uottawa.ca
Wed Apr 3 18:34:10 CEST 2013
> Date: Tue, 2 Apr 2013 06:59:13 -0500
> From: Paul Johnson <pauljohn32 at gmail.com>
> To: qi A <send2aqi at gmail.com>
> Cc: R-help <r-help at r-project.org>
> Subject: Re: [R] DUD (Does not Use Derivatives) for nonlinear
> regression in R?
> Content-Type: text/plain
> On Apr 1, 2013 1:10 AM, "qi A" <send2aqi at gmail.com> wrote:
>> Hi, All
>> SAS has DUD (Does not Use Derivatives)/Secant Method for nonlinear
>> regression. Does R offer this option for nonlinear regression?
>> I have read the helpfile for nls() and could not find such an option, any
> Nelder-Mead is the default algorithm in optim(). It does not use
> derivatives. DUD is from the same generation, but John Nash
> recommended the N-M method.
I'm not sure where Paul saw me recommend N-M, but I do think it is
important, with optimization methods, to recommend a method FOR some
particular class of problems or for a problem-solving situation. A
blanket "this is good" recommendation cannot be made.
I chose NM (slightly BEFORE DUD was released) as the only
derivative-free method in my 1979 book because it had the best balance
of reliability and performance on the 8K machine (code and data) I was
using in 1975. It still works well as a first-try method for
optimization, but it is generally less efficient than gradient-based
methods, in particular because it does not have a good way to know when
it is finished. As a derivative-free method it is "not too bad",
particularly in the nmk version in the dfoptim package. Indeed, I wish
that version were the default in optim(): it can deal with bounds
constraints, though slightly less generally and less well than bobyqa
or some other methods, and it handles a couple of minor details better
than the N-M in optim(), which gives it better performance and
reliability.
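To make the comparison concrete, here is a minimal sketch of calling
Nelder-Mead through optim(), with the dfoptim bounded variant (nmkb)
shown alongside as a commented alternative. The Rosenbrock test
function and all starting values here are my illustrative choices, not
part of the original discussion, and dfoptim is a contributed package
that must be installed separately:

```r
## Rosenbrock "banana" function, a standard derivative-free test case.
fr <- function(x) 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2

## Nelder-Mead is the default method in optim(); no gradient is supplied.
res <- optim(c(-1.2, 1), fr, method = "Nelder-Mead",
             control = list(maxit = 500))
res$par          # near c(1, 1), the true minimum
res$convergence  # 0 means optim() believes it finished normally

## Bounded Nelder-Mead variant from the dfoptim package (sketch;
## requires install.packages("dfoptim")):
## library(dfoptim)
## resb <- nmkb(c(-1.2, 1), fr, lower = c(-2, -2), upper = c(2, 2))
```

The bounds arguments of nmkb() are what plain Nelder-Mead in optim()
cannot offer directly.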
Readers should notice that there are lots of conditional statements
above. It's a matter of selecting the right tool for the job. If you
have lots of compute power and don't mind wasting it, NM will likely
get somewhere near some optimum or other of your problem. It won't do
it terribly fast, and you had better make sure you didn't simply run
out of "iterations" or hit some other limit that stops the program
before it has decided it is done. Check, too, that the answer is the
one you want: most optimization problems have more than one answer,
and the "wrong" ones often seem to be easier to find.
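Both warnings can be checked mechanically. The sketch below (my own
illustrative function with four minima at (+/-1, +/-1), not from the
original post) shows how to detect a hit iteration limit via the
convergence code, and how multiple starting points expose multiple
optima:

```r
## A function with four local minima, at (1,1), (1,-1), (-1,1), (-1,-1).
f <- function(x) (x[1]^2 - 1)^2 + (x[2]^2 - 1)^2

## Starving the method of iterations: convergence code 1 means the
## iteration limit was reached, NOT that a minimum was found.
short <- optim(c(2, 2), f, method = "Nelder-Mead",
               control = list(maxit = 10))
short$convergence  # 1 here: the run was cut off, so distrust short$par

## Multiple starts land on different answers; each one is a "solution".
starts <- list(c(2, 2), c(-2, 2), c(-2, -2), c(2, -2))
sols <- lapply(starts, function(s)
  optim(s, f, method = "Nelder-Mead")$par)
sols  # four distinct minima, one per quadrant
```

Which of the four answers is the "right" one is a modelling question,
not something the optimizer can decide for you.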