[R] A problem about "nlminb"
Ravi Varadhan
rvaradhan at jhmi.edu
Sun May 31 00:41:11 CEST 2009
Popo,
If you indeed have 200,000 unknowns to be estimated, I would suggest that you check out the spg() function in the "BB" package. It requires little storage and hence can handle high-dimensional problems much better.
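For example, here is a minimal sketch using the quadratic test function from Spencer's message below (the maxit value is just illustrative, not a tuned setting):

  library(BB)
  start0 <- rep(1, 200000)
  msLE2 <- function(x) sum(x^2)    # toy objective in 200,000 unknowns
  ## spg() keeps only a handful of length-n vectors, so it avoids the
  ## O(n^2) workspace that nlminb() tries to allocate
  fit <- spg(par = start0, fn = msLE2, control = list(maxit = 1000))
  fit$value       # objective value at the solution (should be near 0)
  head(fit$par)   # first few components of the minimizer

If you can supply an analytic gradient via the 'gr' argument, spg() will usually converge considerably faster.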
Ravi.
____________________________________________________________________
Ravi Varadhan, Ph.D.
Assistant Professor,
Division of Geriatric Medicine and Gerontology
School of Medicine
Johns Hopkins University
Ph. (410) 502-2619
email: rvaradhan at jhmi.edu
----- Original Message -----
From: spencerg <spencer.graves at prodsyse.com>
Date: Saturday, May 30, 2009 4:57 pm
Subject: Re: [R] A problem about "nlminb"
To: David Winsemius <dwinsemius at comcast.net>
Cc: r-help <r-help at r-project.org>
> Your example is NOT self-contained, which means that any potential
> respondent must guess what you mean by "a function with a variable of
> almost 200,000". The following clarifies this:
>
>
> > start0 <- rep(1, 200000)
> > msLE2 <- function(x)sum(x^2)
> > nlminb(start=start0, msLE2, control = list(x.tol = .001))
> Error in vector("double", length) : vector size specified is too large
>
>
> "traceback()" reveals that this error message was generated in
> by
> 'vector("double", length)', where length = 130 + (n * (n + 27))/2),
> and
> n = length(start) = 200,000 in this case. This is 20e9 double
> precision
> numbers or 160 GB. This suggests you need to rethink what you are
> trying to do.
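> A quick check of that arithmetic at the R prompt confirms the figure:
>
> > n <- 200000
> > 130 + (n * (n + 27))/2                # workspace length requested
> [1] 20002700130
> > 8 * (130 + (n * (n + 27))/2) / 1e9    # 8 bytes per double -> GB
> [1] 160.0216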
>
>
> In my opinion, in any problem with more than a fairly small number
> of unknowns, e.g., 3 or 12 depending on the complexity of the problem,
> the vast majority of the unknowns will be better estimated by
> considering them as different samples from some abstract population,
> first estimating the hyperparameters of that population, and then
> estimating the individuals conditional on the hyperparameters. The
> most general tools for that kind of thing in R are in the 'nlme' and
> 'lme4' packages. To understand those, I highly recommend Pinheiro and
> Bates (2000), Mixed-Effects Models in S and S-PLUS (Springer). If your
> observations cannot reasonably be handled by mixed-effects models with
> normal errors, a second reference is Gelman and Hill (2006), Data
> Analysis Using Regression and Multilevel/Hierarchical Models
> (Cambridge University Press). If neither of those seems adequate to
> your problem, I suggest you use the "RSiteSearch.function" function in
> the RSiteSearch package to look for other capabilities in R related to
> your particular application.
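> As a minimal sketch of that hierarchical idea (with a hypothetical
> data frame 'dat' containing a response 'y' and a grouping factor 'id';
> those names are invented for illustration):
>
> > library(lme4)
> > fit <- lmer(y ~ 1 + (1 | id), data = dat)
> > fixef(fit)          # hyperparameter: the population mean
> > VarCorr(fit)        # hyperparameters: between/within-group variances
> > head(ranef(fit)$id) # individual effects, shrunk toward the mean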
>
>
> Hope this helps.
> Spencer Graves
>
>
> David Winsemius wrote:
> >
> > On May 30, 2009, at 2:19 PM, popo UBC wrote:
> >
> >> Hello everyone!
> >>
> >> When I use "nlminb" to minimize a function with a variable of
> almost
> >> 200,000
> >> dimension, I got the following error.
> >>
> >>> nlminb(start=start0, msLE2, control = list(x.tol = .001))
> >> Error in vector("double", length) : vector size specified is too large
> >> I had the following setting
> >>
> >> options(expressions=60000)
> >> options(object.size=10^15)
> >
> > That would do nothing on my machine, but then you may have a
> > different (unspecified) OS. You may have unrealistic expectations.
> > 10^15 seems a bit optimistic to me, even if you were supplying that
> > number in a manner that R would recognize.
> >
> > ?mem.limits # should give you information specific to your OS.
> >
> > If you use Windoze, try also:
> >
> >>
> >> I have no idea what might be wrong. Any suggestion is highly
> >> appreciated!!
> >
> > And we have no idea what sort of setup you have. You could, of
> > course, read the specifics for your OS in the Installation Guide:
> >
> > cran.r-project.org/doc/manuals/R-admin.pdf
> >
>