[R] optim()
Spencer Graves
spencer.graves at pdf.com
Sat Jul 22 07:54:56 CEST 2006
1. I'd be worried any time the answer depended very much on the
starting values. It suggests that the objective function is not well
behaved, and that I must be very careful to make sure I get an appropriate
answer and am not being misled by round-off error, etc.
2. I would NOT use a single, fixed penalty; I'd start with a small
penalty, then increase it gradually until I had a solution that
honestly satisfied the constraints.
3. Alternatively, you could use Lagrange multipliers by redefining
your objective function and including the Lagrange multiplier(s) as
additional parameter(s) to be estimated. That sounds like a sensible idea,
but I have no experience trying it. I would expect it might work fine
provided the objective function and constraints were all differentiable
and sufficiently smooth to admit only one optimum.
4. However, I believe you will gain more insight into the problem by
trying to reduce the number of unknowns rather than increase them. For
example, I noted in my earlier reply that your constraints are
equivalent to f1 = r12*f2, with r12 = (1-c)*k1/(c*(1-k1)), a
constant determined from the known constants in your previous problem
statement. If it were my problem, I might focus on trying to use this
constraint to determine one of (p1, p2, p3) as a function of the other
two. For example, for what combinations in the (p1, p2) space is there
a single, unique solution, when are there none, and when are there 2, 3, or
more?
To accomplish that, I might use 'expand.grid' to generate many
different combinations of (p1, p2) and then use 'optim' to minimize SSD
= (f1-r12*f2)^2 over variations in p3. (Of course, if it were easier to
solve for p1 or p2 in terms of the other two, I might do that.) For
each combination of (p1, p2), I'd store the resulting values of SSD, p3,
and f1. Then if any of the SSD values were numerically greater than 0,
I'd worry about those cases. Otherwise, I'd look at contour and
perspective plots of p3 and f1 vs. p1 and p2 to try to generate some
insight into this problem -- and perhaps generate a simple way to solve
for p3 and f1 from p1 and p2.
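
Something like the following might serve as a starting point (an untested
sketch: the bodies of f1 and f2 and the values of k1 and 'c' below are only
placeholders for your real functions and constants, and I use 'optimize'
rather than 'optim' for the inner search because it is one-dimensional in p3
over (0, 1)):

k1 <- 0.3; cc <- 0.4                    # placeholders; 'cc' = your c (avoids masking base::c)
r12 <- (1 - cc)*k1/(cc*(1 - k1))        # the constant from the constraint f1 = r12*f2
f1 <- function(p1, p2, p3) p1*p2*p3           # placeholder for your real f1
f2 <- function(p1, p2, p3) (p1 + p2 + p3)/3   # placeholder for your real f2

SSD <- function(p3, p1, p2)             # squared violation of f1 = r12*f2
  (f1(p1, p2, p3) - r12*f2(p1, p2, p3))^2

pgrid <- expand.grid(p1 = seq(0.02, 0.98, length = 25),
                     p2 = seq(0.02, 0.98, length = 25))
pgrid$SSD <- pgrid$p3 <- pgrid$f1 <- NA
for (i in seq_len(nrow(pgrid))) {
  opt <- optimize(SSD, c(0, 1), p1 = pgrid$p1[i], p2 = pgrid$p2[i])
  pgrid$p3[i]  <- opt$minimum
  pgrid$SSD[i] <- opt$objective
  pgrid$f1[i]  <- f1(pgrid$p1[i], pgrid$p2[i], opt$minimum)
}
contour(unique(pgrid$p1), unique(pgrid$p2),
        matrix(pgrid$f1, 25, 25), xlab = "p1", ylab = "p2")

(Note that 'optimize' will only find one local minimum; evaluating SSD on a
fine grid of p3 values as well would help flag the (p1, p2) regions where
there are 0 or several solutions.)
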
Make sense?
Hope this helps.
Spencer Graves
Iris Zhao wrote:
> Dear Spencer,
>
> Thank you very much for your helpful reply. I was trying to reproduce a
> table in one paper. After I modified my code according to your suggestion, I
> was able to get results that are very close to those in the paper. It
> seems the starting values of the parameters to be optimized are crucial,
> so I get different optimal values for different starting
> vectors. How can I be sure the minimum value returned by optim() is the
> true optimal value?
>
> I am also curious why you chose a constant penalty to handle the
> constraint in the first place. Why not use the Lagrange multiplier method
> to eliminate the constraint?
>
> Thanks again. I am grateful for your help.
>
> Best regards,
> Iris
>
>
> On 7/18/06, Spencer Graves <spencer.graves at pdf.com> wrote:
>> I have had good luck translating constrained problems into unconstrained
>> ones and then optimizing the unconstrained problem. Have you tried
>> something like the following:
>>
>> Define:
>>
>> z = c(z1, z2, z3), where p1 = 1/(1+exp(-z1)), etc. This translates the
>> constrained problem in the p's into minimizing the unconstrained objective
>>
>> G(z) = P*(f1(z) - r12*f2(z))^2 - f1(z)
>>
>> where f1(z) = f1(p1(z1), p2(z2), p3(z3)), and similarly for f2(z),
>> where P = a penalty term,
>> and r12 = (1-c)*k1/(c*(1-k1)).
>>
>> Can f2(z) ever go outside (0, 1)? If yes, I would modify G(z) by
>> adding a term like (min(0, f2(z), 1-f2(z))^2)
>>
>> If I haven't made a math error, your problem should translate into
>> this form. I would first solve this problem for z with P small, like 1. Then
>> after I've got a solution for that, I increase P to 2, then 10, then
>> 100, etc., until the penalty is so great that the desired equality has
>> been effectively achieved.
>>
>> With 'P' fixed, 'optim' should handle this kind of problem handily.
>> To learn how, I suggest you work through the examples in the ?optim help
>> page. I'd ignore the gradient, at least initially. A silly math error
>> in computing the gradient can delay a solution unnecessarily. If you
>> need to solve thousands of problems like this for different values of
>> k1 and 'c', I might later program the gradient. However, I would not do
>> that initially.
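>>
>> In code, this is roughly the following (an untested sketch: f1 and f2
>> below are placeholders for your real functions, and the values of k1 and
>> 'c' are made up):
>>
>> k1 <- 0.3; cc <- 0.4                      # placeholders; 'cc' = your c (avoids masking base::c)
>> r12 <- (1 - cc)*k1/(cc*(1 - k1))
>> f1 <- function(p) p[1]*p[2]*p[3]          # placeholder for your real f1
>> f2 <- function(p) (p[1] + p[2] + p[3])/3  # placeholder for your real f2
>>
>> G <- function(z, P) {
>>   p <- 1/(1 + exp(-z))                    # maps each z into (0, 1)
>>   P*(f1(p) - r12*f2(p))^2 - f1(p)         # penalty minus the thing to maximize
>> }
>>
>> z <- c(0, 0, 0)                           # starting values, i.e. p = (0.5, 0.5, 0.5)
>> for (P in c(1, 2, 10, 100, 1000)) {
>>   fit <- optim(z, G, P = P)               # Nelder-Mead by default; P is passed on to G
>>   z <- fit$par                            # warm start for the next, larger P
>> }
>> 1/(1 + exp(-fit$par))                     # back-transform to the p scale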
>>
>> Also, if you are not already familiar with Venables and Ripley (2002),
>> Modern Applied Statistics with S, 4th ed. (Springer -- or an earlier
>> edition), I would encourage you to spend some quality time with this
>> book. It can help you with 'optim', with contour plots, etc.
>>
>> Hope this helps,
>> Spencer Graves
>>
>> Iris Zhao wrote:
>>> Dear all,
>>>
>>> I am working on an optimization problem and have some trouble running
>>> optim(). I have two functions (f1, f2) and 4 unknown parameters (p1, p2,
>>> p3, p4). Both f1 and f2 are functions of p1, p2, and p3, denoted by
>>> f1(p1, p2, p3) and f2(p1, p2, p3) respectively.
>>>
>>> The goal is to maximize f1(p1, p2, p3) subject to two constraints:
>>>
>>> (1) c = k1*p4/(k1*p4 + (1-k1)*f1(p1,p2,p3)), where c and k1 are
>>> known constants
>>>
>>> (2) p4 = f2(p1, p2, p3)
>>>
>>> In addition, each parameter ranges from 0 to 1, and both f1 and f2
>>> involve integrations.
>>>
>>> I tried to use Lagrange multipliers to eliminate the two equality
>>> constraints and then use optim() to find the maximum value and the
>>> optimal parameter estimates.
>>>
>>> So I let fn be f1 + lambda1*(c - k1*p4/(k1*p4+(1-k1)*f1(p1,p2,p3))) +
>>> lambda2*(p4 - f2(p1,p2,p3)). The error message I got was "Error in
>>> fn(par, ...) : recursive default argument reference."
>>>
>>> I wonder whether the current built-in functions in R can do this type of
>>> job. Any suggestions would be greatly appreciated.
>>>
>>> Iris
>>>