Hello devels,
I recently got an email from Peter Sephton (he of the critical
values for the KPSS test) raising a question about our
implementation of the ADF-GLS test.
I was thinking of going ahead and trying to implement Peter's
suggestion, but then it occurred to me that it would be good to get
some more input -- it's been a while since I've thought about this
test.
Here's Peter's message:
<quote>
I have a question about the ADF-GLS code. I’ve been programming the
MAIC search in Matlab and I’m not getting the same answers as GRETL,
and I think I know why.
I think GLS detrending should be done on the entire sample, BEFORE
adjustment for lags in the ADF-type test, if you look at the ERS or
the Ng and Perron (2001) papers.
Then the sample should be adjusted to remove lags+1 observations from
the beginning, so that the search for the “optimal” lag length via
the model selection criterion is carried out over a common sample
period. (Alternatively, one can simply set the lag at a fixed value,
but the estimation should still be done over an unchanging
observation space.)
It appears GRETL doesn’t do the search for the optimal lag length
over a common period. This might be solved by explicitly removing
the lags+1 observations at the start of the sample, but that would
then exclude those observations from the GLS detrending.
This is probably a pretty picky point but I thought I should raise
it.
</quote>
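
To make things concrete, here's a rough sketch of the procedure as I
understand Peter to be describing it. This is just illustrative
Python/NumPy, not gretl's actual C code; the function names are mine,
and the degrees-of-freedom conventions in the MAIC formula may not
match Ng and Perron (2001) to the letter.

import numpy as np

def gls_detrend(y, trend=True):
    # ERS GLS detrending, done on the FULL sample, before any
    # adjustment for lags; cbar = -13.5 with a trend, -7.0 with
    # a constant only (Elliott, Rothenberg and Stock, 1996)
    y = np.asarray(y, dtype=float)
    T = y.size
    cbar = -13.5 if trend else -7.0
    rho = 1.0 + cbar / T
    if trend:
        Z = np.column_stack([np.ones(T), np.arange(1, T + 1)])
    else:
        Z = np.ones((T, 1))
    # quasi-difference y and Z, keeping the first observation as is
    yq = np.concatenate(([y[0]], y[1:] - rho * y[:-1]))
    Zq = np.vstack([Z[0], Z[1:] - rho * Z[:-1]])
    delta = np.linalg.lstsq(Zq, yq, rcond=None)[0]
    return y - Z @ delta

def maic_lag(ytilde, kmax):
    # choose the ADF lag by MAIC over a COMMON sample: every
    # candidate k is estimated on the same observations, namely
    # those left after dropping kmax+1 from the start
    dy = np.diff(ytilde)
    best_k, best_maic = 0, np.inf
    for k in range(kmax + 1):
        lhs = dy[kmax:]                   # dependent variable
        cols = [ytilde[kmax:-1]]          # lagged level
        for j in range(1, k + 1):
            cols.append(dy[kmax - j:-j])  # lagged differences
        X = np.column_stack(cols)
        b = np.linalg.lstsq(X, lhs, rcond=None)[0]
        e = lhs - X @ b
        n = lhs.size
        s2 = e @ e / n
        tau = b[0]**2 * (ytilde[kmax:-1] @ ytilde[kmax:-1]) / s2
        maic = np.log(s2) + 2.0 * (tau + k) / n
        if maic < best_maic:
            best_k, best_maic = k, maic
    return best_k

# e.g., on a simulated random walk:
# yt = gls_detrend(np.cumsum(np.random.default_rng(1).standard_normal(200)))
# print(maic_lag(yt, kmax=8))

If I've got it right, the point at issue is just that gls_detrend
operates on the full sample, while maic_lag holds the estimation
sample fixed across all candidate values of k.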
Allin