On 08/15/2012 02:58 PM, artur tarassow wrote:
Dear gretl list, I know that this is a bit off-topic... but nevertheless maybe somebody could help me with this.
I am attempting to program the Chow test for VAR models as implemented in JMulTi. In order to incorporate the bootstrapping part, the original reduced-form VAR needs to be simulated many times based on a resampling technique, as described, e.g., in Lütkepohl's "New Introduction to Multiple Time Series Analysis".
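Schematically, the bootstrap part I have in mind looks something like the following (just a sketch, not the attached script; "uhat", "A", "y0" and "chowstats" are my placeholder names for the estimated residuals, the VAR coefficient matrix, the initial values and the collected test statistics):

  scalar nboot = 999                     # number of bootstrap replications (illustrative)
  matrix chowstats = zeros(nboot, 1)
  loop i=1..nboot --quiet
      matrix u = resample(uhat)          # resample the estimated residuals with replacement
      matrix U = u                       # deterministic terms still missing here -- see below
      matrix Ysim = varsimul(A, U, y0)   # simulate the reduced-form VAR
      # ... re-estimate the VAR on Ysim and store the Chow statistic in chowstats[i]
  endloop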
My problem is the following: when estimating a VAR with deterministic terms and/or exogenous variables, I am not sure how to account for these terms correctly. The reference for the "varsimul" command states that such terms can be handled by folding them into the U matrix.
I attached an example based on a VAR(1) including a linear trend. The problem concerns line 20 of the code, where "matrix U = u .+ B" accounts for the constant, but I am not sure whether "matrix U = u .+ B .+ DD" would be the correct form to account for the linear trend as well.
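To make the question concrete, the relevant part looks roughly like this (a sketch only; "B" and "u" are as in the attached code, "D" is my shorthand for the 1 x n row of trend coefficients, so "DD" is the T x n matrix whose row t equals t * D):

  scalar T = rows(u)                 # number of simulation periods
  matrix trend = seq(1, T)'          # T x 1 trend column -- but which starting value?
  matrix DD = trend * D              # T x n, row t equal to t * D
  matrix U = u .+ B                  # constant folded into U: this part seems fine
  U = u .+ B .+ DD                   # is this the right form once the trend is added?
  matrix Ysim = varsimul(A, U, y0)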
Well, it's clear you need the trend term; without it, it's not the same model. In principle your code looks OK, I guess, but I haven't checked whether you picked the right coefficients.
One further thing to check: you define your 'trend' variable yourself, but the scaling (starting value) is arbitrary; e.g. you could use 0,1,2,... or 1,2,3,... That scaling will be picked up by the estimated constant term. So to really simulate the same model, you need to make sure that the scaling of the trend term in gretl's 'var ... --trend' command is the same as that of your own trend variable. This could get tricky if gretl's trend term is bound to the dataset range (the index variable), which would mean that its starting value depends on the effective sample that is chosen.
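Just to illustrate the point (untested, and whether gretl's trend really follows the dataset index is exactly what would need checking):

  scalar t0 = $t1                            # index of the first observation in the current sample
  matrix trend_gretl = seq(t0, $t2)'         # what gretl might use if its trend follows the index
  matrix trend_own = seq(1, $t2 - t0 + 1)'   # what you get if your own trend always starts at 1
  # The two differ by a constant shift; in estimation that shift is absorbed by the
  # intercept, so building DD with one convention after estimating with the other
  # does not reproduce the same model.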
Don't know if I made myself clear here.
hth,
sven