[R] panel data unit root tests

Spencer Graves spencer.graves at pdf.com
Sat Jan 14 03:15:15 CET 2006


	  If replies to this post are no longer useful to you, then please 
disregard this comment.  If not, I will start by saying that I could 
not understand enough of your question to respond directly. 
RSiteSearch("panel unit root") led me to an exchange we had following a 
related question you posted last October 
(http://finzi.psych.upenn.edu/R/Rhelp02a/archive/63545.html).  Did you 
try the nlme package as suggested there?  I've just now looked at that, 
and I confess I could not figure out how to do it in a few minutes.

	  Do you want to perform a unit root test for a particular panel data 
set you have?  Or do you want to develop software for a particular panel 
unit root test?  If the former, I suggest you prepare a very simple toy 
example trying to do something like this using lme with perhaps corAR1, 
then send this list a question on whether it is possible to do what you 
want with lme, asking how to take the next step with the toy example. 
If instead you want to develop software for a particular unit root test, 
then I suggest you at least provide a more complete citation than just 
"a la Arellano & Bond."  In either case, I also suggest you review the 
posting guide (http://www.R-project.org/posting-guide.html) and try to 
make your post as easy to understand as possible.  In general, I think 
posts that are clear and succinct tend to get quicker, more useful 
answers.  Maybe I'm just dense, but I don't understand what you are 
asking.  For example, your code includes "print(levinlin(ws, year, id, 
lags = 3))".  What is the "levinlin" function?  RSiteSearch("levinlin") 
produced zero hits.
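
	  For concreteness, a toy sketch of what I mean by "lme with perhaps 
corAR1" (the data frame "toy" below is entirely made up; this only fits 
a model, it is not a unit root test in itself):

library(nlme)
## hypothetical balanced panel in long format: N = 20 units, T = 21 years
toy <- data.frame(id   = factor(rep(1:20, each = 21)),
                  year = rep(1:21, times = 20),
                  y    = rnorm(20 * 21))
## random intercept per unit, AR(1) errors within each unit
fit <- lme(y ~ year, random = ~ 1 | id,
           correlation = corAR1(form = ~ year | id),
           data = toy)
summary(fit)

	  The AR(1) parameter reported under the correlation structure would 
then be a natural place to start thinking about near-unit-root behaviour 
within units.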

	  hope this helps.
	  spencer graves

jukka ruohonen wrote:

> When I finally got some time to do some coding, I started and stopped right 
> after. The stationarity test is a good starting point because it demonstrates 
> how we should be able to manipulate the very basic R matrices. I have a 
> real-world small-N data set with 
> 
> rows:
> id(n=1)---t=1---variable1
> ...
> id(n=N=20)---t=T=21---variable1
> 
> Thus, a good test case. For the first id I was considering something like this:
> 
> lag <- as.integer(lags)
> lags.p <- lags + 1
> id <- unique(group)                # cross-section identifiers
> id.l <- length(id)                 # N, number of units
> y.l <- length(y)
> yid.l <- length(y) / id.l          # T, observations per unit (balanced panel)
> if (lag > yid.l - 2)
>     stop("\nlag too long for defined cross-sections.\n")
> 
> # for (i in id) {
>     lagy <- y[2:yid.l]             # level series shifted by one period
>     lagy.em <- embed(lagy, lags)   # matrix of lagged levels
>     dy <- diff(y)[1:(yid.l - 1)]   # first differences for this unit
>     dy.em <- embed(dy, lags)       # matrix of lagged differences
> # }
> print(levinlin(ws, year, id, lags = 3))
> 
> Couldn't figure out the loop over units, but with N = 1 the data 
> transformation seemed to work just fine. Now we should pool the new 
> variables within the panel and, for each unit, regress dy[t] on y[t-1] and 
> dy[t-1] + ... + dy[t-j], with, say, BIC doing the job for the number of 
> lagged differences (H0: the coefficient on y[t-1] is zero). 
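
A minimal sketch of that per-unit step, assuming (hypothetically) that y 
is the stacked series sorted by unit and time, group is the unit 
identifier, and only a constant enters the regression:

adf.by.unit <- function(y, group, max.lag = 3) {
  sapply(split(y, group), function(yi) {
    dy <- diff(yi)                          # first differences of this unit
    bic <- sapply(1:max.lag, function(p) {
      em <- embed(dy, p + 1)                # col 1: dy[t]; cols 2..: dy[t-1]..dy[t-p]
      ly <- yi[(p + 1):(length(yi) - 1)]    # y[t-1] aligned with dy[t]
      BIC(lm(em[, 1] ~ ly + em[, -1, drop = FALSE]))
    })
    p  <- which.min(bic)                    # lag order chosen by BIC
    em <- embed(dy, p + 1)
    ly <- yi[(p + 1):(length(yi) - 1)]
    coef(summary(lm(em[, 1] ~ ly + em[, -1, drop = FALSE])))["ly", ]
  })
}

With the names used in the posted code, adf.by.unit(ws, group) would 
return, for each unit, the estimate, standard error and t statistic of 
the coefficient on y[t-1]; the pooled Levin-Lin statistic needs further 
normalisation steps that are not shown here.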
> 
> Now the above example puts the right-hand-side variables in columns, and if 
> we are dealing with panel models in general, we should store the new 
> variables together with the dX's, which should then give clues to an IV 
> estimator with e.g. orthogonal deviations, e.g. k <- y ~ y[t-1] + x + 
> as.factor(id). So one confusing part is the requirement of some big storage 
> base for the different matrices doing the IV business with lags/levels - the 
> number of instruments can be enormous, with possible calculation problems in 
> a GMM dynamic panel estimator a la Arellano & Bond. Therefore one should 
> code the theoretically relevant instruments beforehand with various 
> transformation matrices. Thus, should I start to study something that can 
> be done with the newly added SparseM package? 
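
A minimal sketch of the kind of sparse instrument storage described 
above, assuming as before that ws holds the stacked series and group the 
unit identifier; the layout is only the textbook Arellano-Bond pattern 
(levels dated t-2 and earlier instrument the differenced equation at 
time t), not a full GMM estimator:

library(SparseM)

## instrument block for one unit in a balanced panel of length Ti
ab.block <- function(yi) {
  Ti <- length(yi)
  Z  <- matrix(0, Ti - 2, (Ti - 1) * (Ti - 2) / 2)  # rows: equations t = 3..Ti
  col <- 0
  for (t in 3:Ti) {
    k <- t - 2                                      # instruments y[1], ..., y[t-2]
    Z[t - 2, col + (1:k)] <- yi[1:k]
    col <- col + k
  }
  Z
}

## stack the per-unit blocks (same column layout when T is equal across
## units) and keep only nonzero entries in compressed sparse row form
Z.dense  <- do.call(rbind, lapply(split(ws, group), ab.block))
Z.sparse <- as.matrix.csr(Z.dense)

matrix.csr objects support t() and %*%, so the cross-products needed for 
the GMM weighting matrix can be formed without holding the dense Z in 
memory.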
> 
> Regards,
> 
> Jukka Ruohonen.
> 
> ______________________________________________
> R-help at stat.math.ethz.ch mailing list
> https://stat.ethz.ch/mailman/listinfo/r-help
> PLEASE do read the posting guide! http://www.R-project.org/posting-guide.html



