[R] panel data unit root tests

Spencer Graves spencer.graves at pdf.com
Sun Jan 15 01:32:47 CET 2006


	  Can anyone help with questions on correlations in lme?  I provide 
below a self-contained toy example that I can't quite complete.

QUESTIONS:
1.  Is there a unit root test in R?  Below please find a failed
attempt to do this using "lme".  (A sketch of an ordinary, single-series
test follows these questions.)
2.  Is there a corStruct class for first differences (unit root)?
"corAR1(1)" generates an error.
3.  How can I store in another R object the estimated AR(1) coefficient
from an lme object generated with 'lme(...,
correlation=corAR1(form=~1|subj))'?
4.  Why am I getting small p-values in testing for a unit root when I
simulate a random walk, i.e. an AR(1) with coefficient 1?
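
For question 1, here is a minimal sketch of an ordinary, single-series
unit root test.  It assumes the contributed "tseries" package is
installed, and it does not handle the grouped/panel case that motivates
the toy example below:

library(tseries)          # for adf.test(); PP.test() is in the standard stats package
set.seed(1)
x <- cumsum(rnorm(100))   # a random walk, i.e. a series with a unit root
adf.test(x, k = 3)        # augmented Dickey-Fuller test; H0: x has a unit root
PP.test(x)                # Phillips-Perron test; same null hypothesis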

TOY EXAMPLE

library(nlme)             # for lme() and corAR1()

nSubj <- 4; nObs <- 5; nSims <- 2
subj <- paste("subj", 1:nSubj, sep="")
UnitRootSims <- array(NA, dim=c(nSims, 2),
       dimnames=list(NULL, c("AR1", "p.value")))
corAR1. <- corAR1(form=~1|subj)
# corAR1(1, form=~1|subj, fixed=TRUE) is not allowed:
# the correlation parameter must lie strictly between -1 and 1,
# so use a value just below 1 instead.
corUnitRoot <- corAR1(0.99999, form=~1|subj, fixed=TRUE)

set.seed(123)
for(i in 1:nSims){
   # simulate a random walk (unit root series) for each subject
   e.subj <- array(rnorm(nObs*nSubj), dim=c(nObs, nSubj),
       dimnames=list(paste("obs", 1:nObs, sep=""), subj))
   y.subj <- apply(e.subj, 2, cumsum)
   Dat <- data.frame(subj=rep(subj, each=nObs),
                     y=as.vector(y.subj))
   fitAR1 <- lme(y~1, data=Dat, random=~1|subj,
        correlation=corAR1., method="ML")
#  UnitRootSims[i, 1] <- ??? How to extract the AR1 estimate?
#  (see the sketch after the output below)
   fitUnitRoot <- lme(y~1, data=Dat, random=~1|subj,
        correlation=corUnitRoot, method="ML")
   aovUnitRoot <- anova(fitAR1, fitUnitRoot)
   UnitRootSims[i, 2] <- aovUnitRoot[2, "p-value"]
}
 > UnitRootSims
      AR1      p.value
[1,]  NA 8.277712e-11
[2,]  NA 1.174823e-08
# Why are the p-values so small?  Shouldn't they be insignificant?
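
On question 3, one possibility, sketched from nlme's coef() method for
corStruct objects (not tested inside the loop above):

## the corAR1 object is stored in the modelStruct component of the fit;
## unconstrained = FALSE returns the parameter on the correlation scale
phi <- coef(fitAR1$modelStruct$corStruct, unconstrained = FALSE)
## so, inside the loop:  UnitRootSims[i, 1] <- phi
## intervals(fitAR1)$corStruct shows the same estimate with a confidence interval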

	  Thanks,
	  Spencer Graves

############################
	  If replies to this post will no longer be useful for you, then please
disregard this comment.  If not, I will start by saying that I could
not understand enough of your question to respond directly.
RSiteSearch("panel unit root") led me to an exchange we had following a
related question you posted last October
(http://finzi.psych.upenn.edu/R/Rhelp02a/archive/63545.html).  Did you
try the nlme package as suggested there?  I've just now looked at that,
and I confess I could not figure out how to do it in a few minutes.

	  Do you want to perform a unit root test for a particular panel data
set you have?  Or do you want to develop software for a particular panel
unit root test?  If the former, I suggest you prepare a very simple toy
example trying to do something like this using lme with perhaps corAR1,
then send this list a question on whether it is possible to do what you
want with lme, asking how to take the next step with the toy example.
If rather you want to develop software for a particular unit root test,
then I suggest you at least provide a more complete citation than just
"a la Arellano & Bond."  In either case, I also suggest you review the
posting guide (www.R-project.org/posting-guide.html) and try to make
your post as easily understood as possible.  In general, I think that
posts that are clear and succinct tend to get quicker, more useful
answers.  Maybe I'm just dense, but I don't understand what you are
asking.  For example, your code includes "print(levinlin(ws, year, id,
lags = 3))".  What is the "levinlin" function?  RSiteSearch("levinlin")
produced zero hits.

	  hope this helps.
	  spencer graves

jukka ruohonen wrote:

> When I finally got some time to do some coding, I started and stopped right
> after.  The stationarity test is a good starting point because it demonstrates
> how we should be able to work with the very basic R matrices.  I have a
> real-world small-N data set with
> 
> rows:
> id(n=1)---t1---variable1
> ...
> id=(N=20)---T=21---variable1
> 
> Thus, a good test case. For first id I was considering something like this:
> 
> lag <- as.integer(lags)
> lags.p <- lags + 1
> id <- unique(group)
> id.l <- length(id)
> y.l <- length(y)
> yid.l <- length(y) / id.l
> if (lag > yid.l - 2)
>     stop("\nlag too long for defined cross-sections.\n")
> 
> # for (i in id) {
>     lagy <- y[2:yid.l]
>     lagy.em <- embed(lagy, lags)
>     dy <- diff(y)[1:(yid.l - 1)]
>     dy.em <- embed(dy, lags)
> # }
> print(levinlin(ws, year, id, lags = 3))
> 
> I couldn't figure out the loop over units, but with N = 1 the data
> transformation seemed to work just fine.  Now we should pool the new
> variables within the panel and regress y on y(t-1) and dy(t-1) + ... + dy(t-j),
> with, say, BIC doing the job for the d's (H0: coefficient on y(t-1) ~ 0), for
> each unit in the panel.
> 
> Now the above example puts the right-hand side in columns, and if we are
> dealing with panel models in general, we should store the new variables
> together with the dX's, which should then give clues to an IV estimator with
> e.g. orthogonal deviations, e.g. k <- y ~ y(t-1) + x + as.factor(id).  So one
> confusing part is the requirement of some big storage base for the different
> matrices doing the IV business with lags/levels - the number of instruments
> can be enormous, with possible calculation problems in a GMM dynamic panel
> estimator a la Arellano & Bond.  Therefore one should code the theoretically
> relevant instruments beforehand with various transformation matrices.  Thus,
> should I start to study something that can be done with the newly added
> SparseM package?
> 
> Regards,
> 
> Jukka Ruohonen.
> 
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html
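
P.S.  For what it's worth, here is a rough, untested sketch (not Jukka's
code) of one way to do the per-unit lag/difference construction and the
ADF-type regression described above, looping over the units with split().
The data frame "panel", its columns "y" and "id", and the lag order "p"
are made-up names; rows are assumed to be ordered by time within each
unit; and the t-statistic on the lagged level has a nonstandard
(Dickey-Fuller) distribution, so the usual summary.lm p-values would not
apply directly.

adf.by.unit <- function(panel, p = 3) {
  ## split the long-format panel by unit and build ADF regressors per unit
  lapply(split(panel$y, panel$id), function(y) {
    dy   <- diff(y)                       # first differences
    X    <- embed(dy, p + 1)              # rows: dy_t, dy_(t-1), ..., dy_(t-p)
    ddy  <- X[, 1]                        # response: dy_t
    dyl  <- X[, -1, drop = FALSE]         # lagged differences
    ylag <- y[(p + 1):(length(y) - 1)]    # y_(t-1), aligned with the rows of X
    lm(ddy ~ ylag + dyl)                  # H0 (unit root): coefficient on ylag is 0
  })
}
## e.g.:  fits <- adf.by.unit(myPanel, p = 3);  lapply(fits, summary)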



