[R] nlme question

Wassell, James T., Ph.D. jtw2 at CDC.GOV
Mon Nov 21 15:28:42 CET 2005


Deepayan, 

Yes, thanks for confirming my suspicions.  I know mixed models are
"different", but I did not think they were so different as to preclude
estimating the var-cov matrix (via the Hessian in maximum likelihood, as
you point out).

Thanks for prompting me to think about MCMC.  Your suggestion makes me
realize that, using BUGS, I could directly sample from the posterior of
the linear combination of parameters to get its variance, and so
eliminate the extra step through the var-cov matrix.  As you say, with
results better than the asymptotic approximation.  (Maybe I can do the
same thing with mcmcsamp?  I'm not familiar with it and will have to
take a look.)
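The "direct" computation is trivial once posterior draws are in hand; a
minimal sketch in base R, using simulated draws as a stand-in for real
BUGS/mcmcsamp output (the parameter names and the weight vector here are
made up purely for illustration):

```r
## Given MCMC draws of the parameters, the posterior of any linear
## combination is obtained by applying the combination draw by draw.
## The draws below are simulated stand-ins for BUGS/mcmcsamp output.

set.seed(42)
n_draws <- 10000
draws <- cbind(beta0 = rnorm(n_draws, mean = 1.0, sd = 0.3),
               beta1 = rnorm(n_draws, mean = 0.5, sd = 0.2))

w  <- c(1, 2)                      # weights of the linear combination
lc <- as.vector(draws %*% w)       # one value of w'theta per draw

mean(lc)                           # posterior mean of the combination
var(lc)                            # posterior variance -- no var-cov step needed
quantile(lc, c(0.025, 0.975))      # a 95% credible interval
```

The point is that the draw-by-draw combination automatically accounts
for any posterior dependence between the parameters, which is exactly
what the delta-method/var-cov route has to supply by hand.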
 
-----Original Message-----
From: Deepayan Sarkar [mailto:deepayan.sarkar at gmail.com] 
Sent: Thursday, November 17, 2005 2:22 PM
To: Doran, Harold
Cc: Wassell, James T., Ph.D.; r-help at stat.math.ethz.ch
Subject: Re: nlme question

On 11/17/05, Doran, Harold <HDoran at air.org> wrote:
> I think the authors are mistaken. Sigma is random error, and due to
> its randomness it cannot be systematically related to anything. It is
> this independence assumption that allows the likelihood to be
> expressed as described in Pinheiro and Bates, p. 62.

I think not. The issue is dependence between the _estimates_ of sigma,
tau, etc., and that may well be present. Presumably, if one can compute
the likelihood surface as a function of the 3 parameters, the Hessian
at the MLEs would give the estimated covariance. However, I don't
think nlme does this.

A different approach you might want to consider is using mcmcsamp in
the lme4 package (or more precisely, the Matrix package) to get
samples from the joint posterior distribution. This is likely to be
better than the asymptotic normal approximation in any case.

Deepayan



