[R] panel data unit root tests
Jukka Ruohonen
jukka.ruohonen at helsinki.fi
Fri Jan 6 22:56:08 CET 2006
When I finally got some time to do some coding, I started and stopped right
after. The stationarity test is a good starting point because it demonstrates
how we should be able to manipulate the very basic R matrices. I have a
real-world small-N data set with rows:
id(n=1)---t=1---variable1
...
id(N=20)---T=21---variable1
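For concreteness, a toy panel with the same shape could be simulated like
this (the column names `id`, `year`, `y` are mine, not from any package):

```r
# Hypothetical balanced panel matching the layout above:
# N = 20 units, T = 21 periods, one variable, stacked in long format.
set.seed(1)
N <- 20
T <- 21
panel <- data.frame(
  id   = rep(1:N, each = T),                        # cross-section identifier
  year = rep(1:T, times = N),                       # time index
  y    = as.vector(replicate(N, cumsum(rnorm(T))))  # a random walk per unit
)
```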
Thus, a good test case. For the first id I was considering something like this:
lag <- as.integer(lags)
lags.p <- lags + 1
id <- unique(group)
id.l <- length(id)
y.l <- length(y)
yid.l <- y.l / id.l              # periods per unit (balanced panel)
if (lag > yid.l - 2)
    stop("\nlag too long for defined cross-sections.\n")
# for (i in id) {
    lagy <- y[1:(yid.l - 1)]     # lagged levels y(t-1) for the first unit
    lagy.em <- embed(lagy, lags)
    dy <- diff(y)[1:(yid.l - 1)] # first differences dy(t) for the first unit
    dy.em <- embed(dy, lags)
# }
print(levinlin(ws, year, id, lags = 3))
I couldn't figure out the loop over units, but with N = 1 the data
transformation seemed to work just fine. Next we should pool the new
variables within the panel and regress dy(t) on y(t-1) and
dy(t-1) + ... + dy(t-j), with, say, BIC choosing the lag order j, and test
H0: the coefficient on y(t-1) is zero, for each unit in the panel.
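The missing loop over units could be sketched roughly as follows (a minimal
sketch under my own naming; `adf.by.unit` and its arguments are hypothetical,
not from any package), assuming a balanced panel stacked in long format:

```r
# For each cross-section, build dy(t), y(t-1) and lagged differences,
# then fit the ADF-style regression
#   dy(t) = rho * y(t-1) + delta_1 * dy(t-1) + ... + delta_j * dy(t-j) + e(t)
adf.by.unit <- function(y, group, lags = 1) {
  ids <- unique(group)
  out <- vector("list", length(ids))
  names(out) <- as.character(ids)
  for (i in ids) {
    yi  <- y[group == i]
    dyi <- diff(yi)
    # embed() puts dy(t) in column 1 and dy(t-1), ..., dy(t-lags)
    # in the remaining columns
    em      <- embed(dyi, lags + 1)
    dy      <- em[, 1]
    dy.lags <- em[, -1, drop = FALSE]
    # y(t-1) aligned with the rows of em, i.e. with t = lags+2, ..., T
    y.lag <- yi[(lags + 1):(length(yi) - 1)]
    out[[as.character(i)]] <- lm(dy ~ y.lag + dy.lags)
  }
  out
}
```

Usage would then be something like `fits <- adf.by.unit(panel$y, panel$id,
lags = 3)`, after which one can pool or test the per-unit rho estimates.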
Now, the example above puts the right-hand-side variables in columns, and if
we are dealing with panel models in general, we should store the new
variables together with the dX's, which should then give clues for an IV
estimator with e.g. orthogonal deviations, e.g.
k <- y ~ y.lag + x + as.factor(id). So one confusing part is the requirement
of some big storage base for the different matrices doing the IV business
with lags/levels: the number of instruments can be enormous, with possible
computational problems in a dynamic panel GMM estimator a la Arellano and
Bond. Therefore one should code the theoretically relevant instruments
beforehand with various transformation matrices. So, should I start to study
what can be done with the newly added SparseM package?
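To see why sparsity matters here, a sketch (my own construction, not from
SparseM) of the Arellano-Bond instrument block for one unit: for the
differenced equation dated t, the valid instruments are the levels
y(t-2), y(t-3), ..., y(1), which gives a block-diagonal Z that is mostly
zeros.

```r
# Build the per-unit instrument matrix Z for the differenced equations
# t = 3, ..., T. Row (t - 2) contains y(1), ..., y(t-2) and zeros elsewhere.
ab.instruments <- function(yi) {
  T <- length(yi)
  rows <- T - 2                       # one row per differenced equation
  Z <- matrix(0, rows, sum(1:rows))   # 1 + 2 + ... + (T-2) instrument columns
  col <- 1
  for (t in 3:T) {
    k <- t - 2                        # number of usable lagged levels
    Z[t - 2, col:(col + k - 1)] <- yi[1:k]
    col <- col + k
  }
  Z
}
```

With T = 21 this already gives sum(1:19) = 190 instrument columns per
variable, nearly all zero, which is exactly the kind of matrix a sparse
storage format is meant for.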
Regards,
Jukka Ruohonen.