On Wed, 27 May 2015, Koray Simsek wrote:
I've been trying to figure out why the test-down option behaves
differently in coint and adf. The help file explains that the approach
used in adf is also used in coint, but the output is inconsistent with
this for the initial DF steps in coint (steps 1 and 2).
I would expect both functions to run the ADF regression on a subset of
the data (for comparability, I presume) consistent with the maximum lag.
Initially this appears to be the case, as the MAIC values for adf and
coint are identical. BUT after finding the number of lags with minimum
MAIC, adf reports the results for the full sample, whereas coint reports
the results for the subset of data on which the original estimation was
done.
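The mechanics at issue can be sketched as follows. This is a minimal illustration, not gretl's actual code: plain Gaussian AIC stands in for gretl's modified AIC (MAIC), and the function names and lag choices are invented. The point is the distinction between (a) selecting the lag order on a common subsample trimmed for the maximum lag, and (b) re-estimating the chosen model either on that same subsample or on the larger sample the chosen lag permits.

```python
import numpy as np

def adf_fit(y, k, start):
    """OLS of dy[t] on a constant, y[t-1] and k lagged differences,
    using observations t = start .. T-1 of the differenced series.
    Returns (AIC, sample size). Illustrative only, not gretl's code."""
    dy = np.diff(y)                 # dy[t] = y[t+1] - y[t]
    T = len(dy)
    t = np.arange(start, T)
    cols = [np.ones(len(t)), y[t]]  # constant and lagged level
    for j in range(1, k + 1):
        cols.append(dy[t - j])      # j-th lagged difference
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, dy[t], rcond=None)
    resid = dy[t] - X @ beta
    n = len(t)
    aic = n * np.log(resid @ resid / n) + 2 * X.shape[1]  # Gaussian AIC
    return aic, n

rng = np.random.default_rng(0)
y = np.cumsum(rng.standard_normal(200))  # simulated random walk
kmax = 6

# Step 1: pick the lag order on the COMMON subsample (start = kmax
# for every candidate k), so the criteria are comparable.
aics = [adf_fit(y, k, kmax)[0] for k in range(kmax + 1)]
kbest = int(np.argmin(aics))

# Step 2: re-estimate with the chosen lag. The two behaviours the
# question contrasts differ only in the starting observation:
aic_full, n_full = adf_fit(y, kbest, kbest)  # full sample for kbest
aic_sub, n_sub = adf_fit(y, kbest, kmax)     # same subsample as step 1

print(f"kbest = {kbest}, n_full = {n_full}, n_sub = {n_sub}")
```

Unless kbest happens to equal kmax, the two re-estimations use different sample sizes (they differ by kmax - kbest observations), which is why the reported statistics can disagree.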
This area was worked over in CVS not so long ago and is now
consistent. The new version of the code is also in the current
snapshots for Windows and Mac.
Allin Cottrell