Hi all,
I remember that a while ago we had to block some filter functions
from spuriously operating on panel data, because they were mixing the
units together. Now I wonder whether something similar is still
happening with commands like corrgm.
Example:
<hansl>
open grunfeld
corrgm invest
</hansl>
I'm getting default output up to lag 23, while this dataset only has a
time dimension of 20. So this looks a bit suspicious: an autocorrelation
at a lag beyond the time-series length of each unit can only come from
mixing observations across units.
What's happening there?
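For comparison, here is a rough per-unit check I would try, restricting the sample to a single cross-sectional unit via the `$unit` accessor so that corrgm only sees one 20-observation time series (the lag order of 10 is an arbitrary choice for illustration):

<hansl>
open grunfeld
# keep only the first cross-sectional unit
smpl $unit == 1 --restrict
# now at most 20 observations per series, so the
# correlogram cannot legitimately go past lag 19
corrgm invest 10
</hansl>

If the full-panel call and the single-unit call disagree in shape (not just in sample size), that would support the suspicion above.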
thanks
sven