[R] Fitting GLM with BFGS algorithm
Prof. John C Nash
nashjc at uottawa.ca
Wed Oct 27 16:44:33 CEST 2010
In the sort of problem mentioned below, the suggestion to supply gradients (I believe this
is what is meant by the "minus score vector") is very important. Using analytic gradients is
almost always a good idea when optimizing smooth functions, both for efficiency of
computation and for quality of results.
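For concreteness, a minimal sketch (mine, not from Dimitris's post below) of supplying an
analytic gradient to optim() for the logistic example he quotes; the minus score for the
Bernoulli log-likelihood is t(X) %*% (p - y):

# same simulated data and objective as in the quoted example
x <- cbind(1, runif(100, -3, 3), rbinom(100, 1, 0.5))
y <- rbinom(100, 1, plogis(c(x %*% c(-2, 1, 0.3))))
fn <- function(betas, y, x) {
    -sum(dbinom(y, 1, plogis(c(x %*% betas)), log = TRUE))
}
# minus score vector, i.e. the gradient of fn: t(x) %*% (p - y)
gr <- function(betas, y, x) {
    c(crossprod(x, plogis(c(x %*% betas)) - y))
}
optim(rep(0, ncol(x)), fn, gr, x = x, y = y, method = "BFGS")

With gr supplied, BFGS avoids finite-difference approximations, which usually means fewer
function evaluations and a more reliable answer.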
Also, users may want to try either updated codes (Rvmmin is a BFGS-type algorithm with box
constraints; ucminf handles the unconstrained case) or entirely different approaches,
depending on the function. The optimx package lets users discover the relative properties
of different optimizers on their class of problems.
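A sketch of how such a comparison might look (assuming optimx and the Rvmmin and ucminf
packages are installed, and reusing fn, gr, x and y from the sketch above):

library(optimx)
# run several quasi-Newton methods on the same objective and gradient
optimx(rep(0, ncol(x)), fn, gr, x = x, y = y,
       method = c("BFGS", "Rvmmin", "ucminf"))

The answers and convergence indicators can then be compared across methods.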
John Nash
> From: Dimitris Rizopoulos <d.rizopoulos at erasmusmc.nl>
> To: justin bem <justin_bem at yahoo.fr>
> Cc: R Maillist <r-help at stat.math.ethz.ch>
> Subject: Re: [R] Fitting GLM with BFGS algorithm
>
> For instance, for logistic regression you can do something like this:
>
> # simulate some data
> x <- cbind(1, runif(100, -3, 3), rbinom(100, 1, 0.5))
> y <- rbinom(100, 1, plogis(c(x %*% c(-2, 1, 0.3))))
>
> # BFGS from optim()
> fn <- function(betas, y, x) {
>     -sum(dbinom(y, 1, plogis(c(x %*% betas)), log = TRUE))
> }
> optim(rep(0, ncol(x)), fn, x = x, y = y, method = "BFGS")
>
> # IWLS from glm()
> glm(y ~ x[, -1], family = "binomial")
>
> You can also improve it by providing the minus score vector (the
> gradient of fn) as the third argument, gr, of optim().
>
>
> I hope it helps.
>
> Best,
> Dimitris