On Tue, 22 May 2012, Anutechia Asongu wrote:
I have a small question. I'm using system GMM to test a
hypothesis. Switching on the 'asymptotic standard errors' option
increases the apparent significance of the estimated coefficients.
Is there a rule of thumb for when to include this option?
The rule of thumb is "don't use them"! (Unless perhaps you have a
very large number of observations.) Asymptotic standard errors are
known to be misleading in finite samples: they tend to be too
small, which is exactly why they make your coefficients look more
significant.
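
For concreteness, here's a minimal sketch using the Arellano-Bond
sample data shipped with gretl. The lag order and regressors are
purely illustrative, and you should check "help dpanel" for the
exact option names and defaults, but the point is the contrast
between the default robust standard errors (Windmeijer-corrected
in the two-step case) and the --asymptotic flag:

  open abdata.gdt

  # Blundell-Bond system GMM, two-step, with gretl's default
  # robust (Windmeijer-corrected) standard errors
  dpanel 1 ; n w k --system --two-step

  # same model, but requesting plain asymptotic standard errors;
  # these will typically be smaller and the t-ratios "better",
  # but don't trust them unless the sample is very large
  dpanel 1 ; n w k --system --two-step --asymptotic

Compare the two sets of standard errors: the asymptotic ones will
generally be the smaller, and in a small sample that apparent gain
in precision is spurious.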
Allin Cottrell