Dear gretl list, I know that this is a bit off-topic... but maybe somebody could help me with it nonetheless.

I am attempting to program the Chow test for VAR models as implemented in JMulTi. To incorporate the bootstrapping part, the original reduced-form VAR needs to be simulated many times based on a resampling technique, as described, e.g., in Lütkepohl's "New Introduction to Multiple Time Series Analysis".
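For concreteness, one bootstrap replication would look roughly like the sketch below. This is only an outline with placeholder names (A for the estimated coefficient matrix, b for the constant, uhat for the residual matrix, y0 for the initial values), not the names used in my attached script:

  # one bootstrap replication, VAR(1) with constant only;
  # A: n x n coefficients, b: 1 x n constant, uhat: T x n residuals,
  # y0: 1 x n initial values
  matrix ustar = resample(uhat)                # redraw residual rows with replacement
  matrix ystar = varsimul(A, ustar .+ b, y0)   # rebuild the artificial series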

My problem is the following:
When estimating a VAR with deterministic terms and/or exogenous variables, I am not sure how to account for these terms correctly in the simulation. The reference entry for the "varsimul" command states that such terms can be handled by folding them into the U matrix.

I have attached an example based on a VAR(1) including a linear trend. The problem concerns line 20 of the code, where "matrix U = u .+ B" accounts for the constant; I am not sure whether "matrix U = u .+ B .+ DD" would be the correct form to account for the linear trend as well.
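To make the question concrete: if the trend coefficients have to enter U evaluated at their time index, I would build DD as below (d standing for the n x 1 vector of estimated trend coefficients); whether this is the right reading of the documentation is exactly what I am unsure about:

  scalar T = rows(u)              # number of periods to simulate
  matrix DD = seq(1, T)' * d'     # row t equals t times the trend coefficients
  matrix U = u .+ B .+ DD         # constant plus trend folded into the disturbances
  matrix Y = varsimul(A, U, y0)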

Thanks in advance,
Artur