[R] Advice wanted on using optim with both continuous and discrete par arguments...

Jeffrey Racine racinej at mcmaster.ca
Mon Mar 1 20:30:57 CET 2010


Hi Ravi.

Many thanks.

My problem is a mixed integer quadratic programming one (the objective function is a sum of squared jackknife residuals), so it is not fully DTOP.

However, the only package I see that can handle this is Rcplex, which is not to my taste as it requires a commercial solver. Pointers to open R code that can do MIQP would be ideal.

Thanks again!

-- Jeff

On 2010-03-01, at 2:25 PM, Ravi Varadhan wrote:

> Hi Uwe & Jeff,
> 
> It seems to me that Jeff's problem falls into the class of "mixed-integer nonlinear programming" (MINLP) problems.  I would classify this as a "damn tough optimization problem" (DTOP!).  I am not sure whether there is any R package that can handle this, or any R link to a commercial package that can (e.g. Rcplex).  The optimization task view only talks about MILP and MIQP (mixed-integer linear and quadratic programming).
> 
> Being totally ignorant of the state-of-the-art methods for MINLP, I would like to suggest a "novel" approach for solving this:
> 
> Let f(X,Y) be the objective function, where X and Y are the continuous and categorical variables.  You already have constraints g(X) on X.  Now, let us treat Y as continuous and define some artificial constraints on it.  For example, if we have only one variable y, define a constraint function that penalizes y as it moves away from the integer values I = {0, 1, ..., K}; call this h(y; I, \sigma).  This function is zero on the integer set I, but increases monotonically with the distance from the integer points.  The parameter \sigma is like a scaling parameter.  It should be relatively easy to come up with a smooth penalty function (maybe some kind of K-term sum of reciprocals of Gaussian densities).  You can then solve the resulting nonlinear programming problem for a sequence of \sigma values, starting with a large dispersion and decreasing it towards zero.  You could extrapolate to the limit as \sigma goes to zero.  Of course, you will then round off the solution at the end.  Does this make sense?
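Ravi's continuation idea can be sketched in R along these lines. This is an illustrative toy with a made-up objective, not code from the thread, and a squared distance-to-nearest-integer penalty stands in for the Gaussian-density suggestion:

```r
## Illustrative sketch of the penalty/continuation idea (toy problem):
## penalize the squared distance of y from the nearest integer in
## {0, ..., K}, with the scale sigma shrinking towards zero.
h <- function(y, K, sigma) {
  yi <- pmin(pmax(round(y), 0), K)  # nearest integer in [0, K]
  sum((y - yi)^2) / sigma^2
}

## made-up objective: continuous x in [0,1], "integer" y in [0,K]
f <- function(par, K, sigma) {
  x <- par[1]; y <- par[2]
  (x - 0.3)^2 + (y - 4.2)^2 + h(y, K, sigma)
}

K <- 10
par <- c(0.5, 5)                     # starting values
for (sigma in c(2, 1, 0.5, 0.1, 0.02)) {
  fit <- optim(par, f, method = "L-BFGS-B",
               lower = c(0, 0), upper = c(1, K),
               K = K, sigma = sigma)
  par <- fit$par                     # warm-start the next, tighter sigma
}
round(par[2])                        # round off the integer part at the end
```

The warm start across the decreasing \sigma sequence is what keeps the iterates in the right integer basin before the final rounding.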
> 
> Ravi.
> 
> ____________________________________________________________________
> 
> Ravi Varadhan, Ph.D.
> Assistant Professor,
> Division of Geriatric Medicine and Gerontology
> School of Medicine
> Johns Hopkins University
> 
> Ph. (410) 502-2619
> email: rvaradhan at jhmi.edu
> 
> 
> ----- Original Message -----
> From: Uwe Ligges <ligges at statistik.tu-dortmund.de>
> Date: Monday, March 1, 2010 12:59 pm
> Subject: Re: [R] Advice wanted on using optim with both continuous and discrete par arguments...
> To: Jeffrey Racine <racinej at mcmaster.ca>
> Cc: "r-help at r-project.org" <r-help at r-project.org>
> 
> 
>> 
>> On 01.03.2010 18:34, Jeffrey Racine wrote:
>>> Thanks Uwe.
>>> 
>>> I appreciate your feedback... in the paragraph in my email beginning 
>>> "The problem..."
>> 
>> Whoops, apologies; apparently I read your message too quickly.
>> CCing R-help to add:
>> 
>> There is the Optimization Task View on CRAN:
>> 
>> See particularly the hints related to Mixed integer programming and 
>> its variants.
>> 
>> Best wishes,
>> uwe
>> 
>>> I point out that yes, I indeed do what you suggest for small 
>>> problems, but encounter problems where this is not feasible (e.g., 
>>> 10 variables with integers ranging from 0 to 20 for each variable).
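A quick count shows why exhaustive search is infeasible at this size (simple arithmetic on Jeff's example, not a figure from the thread):

```r
## 10 discrete variables, each taking values in 0:20, gives 21^10
## combinations -- far too many to enumerate with an optim() call each.
n <- prod(rep(21, 10))
n  # 16679880978201, about 1.7e13
```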
>>> 
>>> Thanks again!
>>> 
>>> -- Jeff
>>> 
>>> On 2010-03-01, at 12:28 PM, Uwe Ligges wrote:
>>> 
>>>> 
>>>> 
>>>> On 01.03.2010 16:34, Jeffrey Racine wrote:
>>>>> Dear R users,
>>>>> 
>>>>> I have a problem for which my objective function depends on both 
>>>>> discrete and continuous arguments.
>>>>> 
>>>>> The problem is that the number of combinations for the 
>>>>> (multivariate) discrete arguments can become overwhelming (when it 
>>>>> is univariate this is not an issue); hence search over the 
>>>>> continuous arguments for each possible combination of the discrete 
>>>>> arguments may not be feasible. Guided search over the discrete and 
>>>>> continuous arguments would be infinitely preferable. That is, 
>>>>> exhaustive search over all possible combinations works perfectly, 
>>>>> but for large problems exhaustive search simply is not in the 
>>>>> feasible set.
>>>>> 
>>>>> Both the discrete and continuous arguments are bounded (the 
>>>>> discrete lie in [0,K] and the continuous in [0,1]) and I am using 
>>>>> L-BFGS-B with lower and upper vectors defining these bounds.
>>>>> 
>>>>> The issue is that when I feed optim my objective function and par 
>>>>> (whose first `k' elements must necessarily be rounded by my 
>>>>> objective function while the remaining `l' arguments are 
>>>>> continuous), the default settings naturally do not perform well at 
>>>>> all. Typically, if the initial values for the discrete variables 
>>>>> are, say, par[1:3] = c(2,3,4) while those for the continuous are, 
>>>>> say, par[4:6] = c(.17, .35, .85), then optim finds the minimum for 
>>>>> only the continuous variables and dumps back the initial values 
>>>>> for the discrete variables. I presume that the numerical 
>>>>> approximation to the gradient/Hessian is thrown off by the `flat 
>>>>> spots' or step-like nature of the objective function with respect 
>>>>> to the discrete variables.
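The flat-spot failure Jeff describes is easy to reproduce with a toy objective (my illustration, not his actual function): because the objective rounds its first argument, the default finite-difference step (ndeps = 1e-3) never changes the rounded value, so the estimated gradient in that coordinate is exactly zero and L-BFGS-B leaves it at its starting value.

```r
## Toy reproduction of the flat-spot problem: par[1] is treated as
## discrete by rounding inside the objective, par[2] is continuous.
f <- function(par) (round(par[1]) - 4)^2 + (par[2] - 0.35)^2

fit <- optim(c(2, 0.8), f, method = "L-BFGS-B",
             lower = c(0, 0), upper = c(20, 1))
fit$par[1]  # stays at 2: finite differences see a flat spot there
fit$par[2]  # converges to about 0.35
```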
>>>>> 
>>>>> I have played with ndeps, parscale, etc., but nothing really 
>>>>> works. I realize this is a mixed combinatorial/continuous 
>>>>> optimization problem and would love pointers to any related 
>>>>> literature or, ideally, an R package that implements such a beast.
>>>>> 
>>>>> However, if anyone has attempted to use optimization routines 
>>>>> native to R with any success in similar settings, I would love to 
>>>>> get your feedback.
>>>> 
>>>> 
>>>> If your first 3 discrete variables have a rather limited number of 
>>>> possible values (it sounds like that is the case), you may want to 
>>>> apply optim() to the other variables on a complete grid of the 
>>>> first 3 variables and select the best result. Even with a complete 
>>>> evaluation of the whole grid, with say 20 possible values in each 
>>>> dimension, the results should be there within minutes, without a 
>>>> need for more specialized optimization procedures. ...
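Uwe's grid approach can be sketched like this (again with a made-up toy objective, assumed for illustration, and two discrete variables rather than three):

```r
## Exhaustive grid over the discrete part; optim() over the continuous
## part at each grid point; keep the best combination overall.
## The objective f is a made-up quadratic, not Jeff's jackknife criterion.
f <- function(xc, xd) sum((xc - c(0.2, 0.7))^2) + sum((xd - c(3, 5))^2)

K <- 10
grid <- expand.grid(d1 = 0:K, d2 = 0:K)   # all discrete combinations

fits <- lapply(seq_len(nrow(grid)), function(i) {
  xd <- unlist(grid[i, ])
  optim(c(0.5, 0.5), f, xd = xd, method = "L-BFGS-B",
        lower = c(0, 0), upper = c(1, 1))
})
vals <- vapply(fits, `[[`, numeric(1), "value")
best <- which.min(vals)
unlist(grid[best, ])   # best discrete combination
fits[[best]]$par       # continuous optimum at that combination
```

With 3 discrete variables and 20 levels each this is 8000 optim() calls, which is tractable; it is the 10-variable case that puts the full grid out of reach.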
>>>> 
>>>> Best wishes,
>>>> Uwe Ligges
>>>> 
>>>> 
>>>> 
>>>>> Many thanks in advance for your advice.
>>>>> 
>>>>> -- Jeff
>>>>> 
>>>>> 
>>>>> 
>>>>> ______________________________________________
>>>>> R-help at r-project.org mailing list
>>>>> 
>>>>> PLEASE do read the posting guide 
>>>>> and provide commented, minimal, self-contained, reproducible code.
>>> 
>> 

Professor J. S. Racine         Phone:  (905) 525 9140 x 23825
Department of Economics        FAX:    (905) 521-8232
McMaster University            e-mail: racinej at mcmaster.ca
1280 Main St. W.,Hamilton,     URL: http://www.economics.mcmaster.ca/racine/
Ontario, Canada. L8S 4M4

`The generation of random numbers is too important to be left to chance'


