Monte Carlo Simulation
by Alessandra .Cruanes Aguilar
Dear all,
I have been having some troubles computing a Monte Carlo simulation.
I am a student, so please don't be too horrified by my short experience :)
The problem is the following:
I am requested to perform a MC simulation on a lagged-dependent variable
model with autocorrelated residuals. The estimator for the lagged-dependent
variable, the only regressor, should be inconsistent and biased.
My concern is that decreasing the sample size the bias decreases, and I
guess that that's just wrong.
(I also tried using robust std errors, and nothing changes!)
here's the code I used:
nulldata 200
> setobs --time-series
>
> scalar n=$nobs
>
> scalar beta1=0.8
> scalar rho=0.4
>
> smpl 1 1
> series u=0
> series y=0
>
> smpl 2 200
> series e=randgen(N,0,1)
> series u=rho*u(-1)+e
> series y=beta1*y(-1)+u
>
>
> ols y y(-1) --simple
>
>
> loop 5000 --progressive --quiet
>
>
> series e=randgen(N,0,1)
> series u=rho*u(-1)+e
> series y=beta1*y(-1)+u
>
> ols y y(-1)
>
> scalar beta1hat= $coeff(y_1)
> scalar s1= $stderr(y_1)
> scalar bias=$coeff(y_1)-beta1
> genr LB = beta1hat - critical(t,$df,0.025)*s1
> genr UB = beta1hat + critical(t,$df,0.025)*s1
> print beta1 beta1hat bias LB UB
>
> endloop
>
>
> smpl 50 --random --replace
>
> loop 5000 --progressive --quiet
>
> series e=randgen(N,0,1)
> series u=rho*u(-1)+e
> series y=beta1*y(-1)+u
>
> ols y y(-1)
>
> scalar beta1hat= $coeff(y_1)
> scalar s1= $stderr(y_1)
> scalar bias=$coeff(y_1)-beta1
> genr LB = beta1hat - critical(t,$df,0.025)*s1
> genr UB = beta1hat + critical(t,$df,0.025)*s1
> print beta1 beta1hat bias LB UB
>
> endloop
>
Thank you so much