On 11.01.2008 11:06, andreas.rosenblad(a)ltv.se wrote:
> svetosch(a)gmx.net @ INTERNET wrote on 2008-01-11 10:37:48:
>> On 11.01.2008 08:49, andreas.rosenblad(a)ltv.se wrote:
>>> cottrell(a)wfu.edu @ INTERNET wrote on 2008-01-10 19:40:46:
>>>> On Thu, 10 Jan 2008, Ignacio Diaz-Emparanza wrote:
>>>>> ols y 0 xlist
>>>>> omit --auto
>>>>> That's perfect !!
>>>>> But what does "sequentially" mean here?
>>>> 1. Find the variable with the highest p-value (other than the constant).
>>>> 2. Is that p-value greater than (say) 0.1?
>>>>    - If so, drop the variable, re-estimate the model and go to step 1.
>>>>    - If not, stop and report the results, including an F-test on all the variables omitted.
>>> So it is a backward selection procedure. It's great that it has been added; I have missed it. Could you add a forward selection procedure too, please?
>> In case you mean specific-to-general instead of general-to-specific, I don't think that's a good idea.
> Why shouldn't it be a good idea?
I'm basically a follower of David Hendry here, who has written a lot about this. For example: you start with a small model which is later shown to have suffered from omitted-variable bias. That means the inference (the tests) based on the small model was rubbish. But you used that rubbish inference to arrive at your preferred model, so you can't even trust the final model's tests that say the earlier model was rubbish. It's a vicious circle and a big ol' mess.
-sven
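
For anyone who wants to see the logic of the backward-elimination loop Allin describes above spelled out: here is a minimal sketch, not gretl's actual implementation, written in Python with statsmodels. The 0.10 cutoff, the function name omit_auto and the synthetic data are illustrative assumptions only.

import numpy as np
import pandas as pd
import statsmodels.api as sm

def omit_auto(y, X, alpha=0.10):
    """Sequentially drop the regressor with the largest p-value, as long as
    that p-value exceeds alpha; the constant is never a candidate."""
    X = sm.add_constant(X)                 # adds a column named "const"
    full = sm.OLS(y, X).fit()              # keep the full model for the closing F-test
    res, dropped = full, []
    while True:
        pvals = res.pvalues.drop("const")  # step 1: every p-value except the constant's
        if pvals.empty or pvals.max() <= alpha:
            break                          # step 2, "if not": stop
        worst = pvals.idxmax()
        dropped.append(worst)              # step 2, "if so": drop, re-estimate, repeat
        X = X.drop(columns=worst)
        res = sm.OLS(y, X).fit()
    if dropped:                            # F-test on all the omitted variables jointly
        f_stat, f_pval, df_diff = full.compare_f_test(res)
        print("F-test on %d omitted variables: F = %.3f, p = %.4f"
              % (len(dropped), f_stat, f_pval))
    return res, dropped

# Illustrative usage on synthetic data (x3 and x4 are irrelevant by construction).
rng = np.random.default_rng(0)
X = pd.DataFrame(rng.normal(size=(200, 4)), columns=["x1", "x2", "x3", "x4"])
y = 1.0 + 2.0 * X["x1"] - 1.5 * X["x2"] + rng.normal(size=200)
final, dropped = omit_auto(y, X)
print("dropped:", dropped)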