Hi,
Suppose the true model is Y = XB + GH + error, but G is unobserved, so the
regression you actually run is Y = XB + (GH + error), and you then use the
output to forecast various things. If G contains variables that trend in one
direction over time (e.g. population in a given area), will the forecast get
less accurate as we move away from the time period used to estimate the
coefficients? If so, is there some test that can give an idea of how often
the model should be re-estimated? This would also mean that the out-of-sample
R^2 should fall as we move away from the estimation period, right?
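
For concreteness, here is a minimal simulation sketch of the setup I have in
mind (Python/NumPy; all the names, coefficients, and sample split are just
made-up assumptions for illustration). The true process includes an omitted
trending term G*H, the model is fit on X alone over an early window, and the
out-of-sample error is then checked over later windows:

import numpy as np

rng = np.random.default_rng(0)

T = 200                       # total periods
t = np.arange(T)
X = rng.normal(size=(T, 2))   # observed regressors
G = 0.05 * t + rng.normal(scale=0.2, size=T)  # unobserved, trending (e.g. population)
B = np.array([1.5, -0.8])
H = 2.0
e = rng.normal(scale=1.0, size=T)

Y = X @ B + G * H + e         # true data-generating process

# Estimate on the first 100 periods using only X (G is omitted)
train = slice(0, 100)
Xc = np.column_stack([np.ones(T), X])         # add an intercept
coef, *_ = np.linalg.lstsq(Xc[train], Y[train], rcond=None)

# Forecast the later periods and look at how the error evolves
resid = Y - Xc @ coef
for start, stop in [(100, 125), (125, 150), (150, 175), (175, 200)]:
    rmse = np.sqrt(np.mean(resid[start:stop] ** 2))
    print(f"periods {start}-{stop}: out-of-sample RMSE = {rmse:.2f}")
# The RMSE grows with distance from the estimation window because the omitted
# trending term G*H keeps drifting away from its in-sample level, which the
# intercept absorbed at estimation time.

In this toy run the out-of-sample RMSE rises steadily across the later
windows, which is exactly the deterioration I am asking about.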
Thanks,
Chris