Re: [Gretl-users] Constant in the log-likelihood
by Alecos Papadopoulos
Christ, what a stupid mistake. I apologize to everybody who has been
following this conversation.
The correct constant is ln(4/sqrt(2*$pi)) = 0.467355827915218...
...and now the code runs smoothly: if I include the constant as an
expression, ln(4/sqrt(2*$pi)), it gives a zero gradient and the correct
value for the log-likelihood, with estimates virtually identical to those
from the mle code without the constant.
It still has problems if I include the constant as the numerical value
0.467355827915218..., in which case I get
loglikelihood = -172.877055337 (steplength = 5.36871e-021)
Gradients: 0.98941 -16.941 65.400 61.632 -0.42174 0.31255
-0.078953 0.32094 -0.099944 -0.028678 1.6085 23.937
0.80591 6.6747e-005 0.045333 0.0018853 (norm 7.16e-001)
...which is obviously not at the maximum. Also, here one of the non-negative
terms is estimated as close to zero as possible, meaning that the iteration
was pushed towards negative values but was not permitted to go there, and this
is perhaps why the gradient is not zero.
Also, if I truncate the constant to only 7 decimal digits (instead of 15), the code runs and gives estimates and a non-zero gradient as above, but it also prints the message "Matrix is not positive definite, Error executing script: halting". Perhaps this explains something to those who know more...
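As a sanity check that can be run outside gretl (plain Python here, not hansl), the two rival constants discussed in this thread differ by exactly ln($pi), since sqrt(2*$pi)/sqrt(2/$pi) = $pi:

```python
import math

# per-observation constant with 2*pi (the corrected version)
c_correct = math.log(4 / math.sqrt(2 * math.pi))
# the same expression typed with 2/pi, as in the earlier messages
c_wrong = math.log(4 / math.sqrt(2 / math.pi))

print(c_correct)                 # ~0.467355827915218
print(c_wrong)                   # ~1.6120857
print(c_wrong - c_correct)       # equals ln(pi)
```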
Still, thanks Allin: my issue is essentially solved. And apologies again.
Alecos Papadopoulos
Athens University of Economics and Business, Greece
Department of Economics
cell:+30-6945-378680
fax: +30-210-8259763
skype:alecos.papadopoulos
On 11/7/2013 19:00, gretl-users-request(a)lists.wfu.edu wrote:
> Send Gretl-users mailing list submissions to
> gretl-users(a)lists.wfu.edu
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.wfu.edu/mailman/listinfo/gretl-users
> or, via email, send a message with subject or body 'help' to
> gretl-users-request(a)lists.wfu.edu
>
> You can reach the person managing the list at
> gretl-users-owner(a)lists.wfu.edu
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Gretl-users digest..."
>
>
> Today's Topics:
>
> 1. Re: Constant in log-likelihood (Alecos Papadopoulos)
> 2. Re: Constant in log-likelihood (Allin Cottrell)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Wed, 10 Jul 2013 19:18:08 +0300
> From: Alecos Papadopoulos <papadopalex(a)aueb.gr>
> Subject: Re: [Gretl-users] Constant in log-likelihood
> To: gretl-users(a)lists.wfu.edu
> Message-ID: <51DD8940.4030204(a)aueb.gr>
> Content-Type: text/plain; charset=ISO-8859-1; format=flowed
>
> In model 1 (without the constant in the log-l), the value of the
> maximized log-likelihood is -447.
> If one wants to arrive at the actual full value of the log-likelihood
> one should add to this nobs*ln(4/sqrt(2/$pi)) = 595*0.46.. = 274, and
> obtain -447 + 274 = -173.
>
> Now in model 2, parameter estimates are virtually identical with model
> 1, so one would expect that, since here the constant is already included
> in the logl eq., it would arrive at a comparable result with the
> corrected full value (i.e. -173), and not +511, which is the value for
> the logl given by the code in model 2.
>
> Alecos Papadopoulos
> Athens University of Economics and Business, Greece
> Department of Economics
> cell:+30-6945-378680
> fax: +30-210-8259763
> skype:alecos.papadopoulos
>
> On 10/7/2013 19:00, gretl-users-request(a)lists.wfu.edu wrote:
>
>> ------------------------------ Message: 2 Date: Wed, 10 Jul 2013
>> 09:05:57 +0200 (CEST)
>> From: "Riccardo (Jack) Lucchetti" <r.lucchetti(a)univpm.it>
>> Subject: Re: [Gretl-users] Graph two densities together and Constant
>> in Log Likelihood To: Gretl list <gretl-users(a)lists.wfu.edu>
>> Message-ID: <alpine.DEB.2.10.1307100904510.23263(a)ec-4.econ.univpm.it>
>> Content-Type: text/plain; charset="iso-8859-15" On Wed, 10 Jul 2013,
>> Alecos Papadopoulos wrote:
>>> Just a note that
>>> in model 2 (that is virtually identical to model 1 as regards to
>>> estimates and final gradient values), the value of the logl appears
>>> positive.
>> Since you add a positive constant to each observation, why would that be
>> surprising?
>>
>> -------------------------------------------------------
>> Riccardo (Jack) Lucchetti
>> Dipartimento di Scienze Economiche e Sociali (DiSES)
>>
>> Università Politecnica delle Marche
>> (formerly known as Università di Ancona)
>>
>> r.lucchetti(a)univpm.it
>> http://www2.econ.univpm.it/servizi/hpp/lucchetti
>> -------------------------------------------------------
>>
>> ------------------------------
>>
>> _______________________________________________
>> Gretl-users mailing list
>> Gretl-users(a)lists.wfu.edu
>> http://lists.wfu.edu/mailman/listinfo/gretl-users
>>
>> End of Gretl-users Digest, Vol 78, Issue 12
>> *******************************************
>>
>
>
> ------------------------------
>
> Message: 2
> Date: Wed, 10 Jul 2013 12:43:09 -0400 (EDT)
> From: Allin Cottrell <cottrell(a)wfu.edu>
> Subject: Re: [Gretl-users] Constant in log-likelihood
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.LFD.2.10.1307101237360.16292@myrtle>
> Content-Type: TEXT/PLAIN; charset=US-ASCII; format=flowed
>
> On Wed, 10 Jul 2013, Alecos Papadopoulos wrote:
>
>> In model 1 (without the constant in the log-l), the value of the
>> maximized log-likelihood is -447.
>> If one wants to arrive at the actual full value of the log-likelihood
>> one should add to this nobs*ln(4/sqrt(2/$pi)) = 595*0.46.. = 274, and
>> obtain -447 + 274 = -173.
> ln(4/sqrt(2/$pi)) is not 0.46..., it's 1.612... And multiplication
> by 595 gives an add factor of 959. You're calculating using 2*$pi
> rather than 2/$pi?
>
> Allin Cottrell
>
>
>
>
>
> ------------------------------
>
> _______________________________________________
> Gretl-users mailing list
> Gretl-users(a)lists.wfu.edu
> http://lists.wfu.edu/mailman/listinfo/gretl-users
>
> End of Gretl-users Digest, Vol 78, Issue 13
> *******************************************
>
Re: [Gretl-users] Constant in log-likelihood
by Alecos Papadopoulos
In model 1 (without the constant in the log-l), the value of the
maximized log-likelihood is -447.
If one wants to arrive at the actual full value of the log-likelihood
one should add to this nobs*ln(4/sqrt(2/$pi)) = 595*0.46.. = 274, and
obtain -447 + 274 = -173.
Now in model 2, parameter estimates are virtually identical with model
1, so one would expect that, since here the constant is already included
in the logl eq., it would arrive at a comparable result with the
corrected full value (i.e. -173), and not +511, which is the value for
the logl given by the code in model 2.
Alecos Papadopoulos
Athens University of Economics and Business, Greece
Department of Economics
cell:+30-6945-378680
fax: +30-210-8259763
skype:alecos.papadopoulos
On 10/7/2013 19:00, gretl-users-request(a)lists.wfu.edu wrote:
> ------------------------------ Message: 2 Date: Wed, 10 Jul 2013
> 09:05:57 +0200 (CEST)
> From: "Riccardo (Jack) Lucchetti" <r.lucchetti(a)univpm.it>
> Subject: Re: [Gretl-users] Graph two densities together and Constant
> in Log Likelihood To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.DEB.2.10.1307100904510.23263(a)ec-4.econ.univpm.it>
> Content-Type: text/plain; charset="iso-8859-15" On Wed, 10 Jul 2013,
> Alecos Papadopoulos wrote:
>> Just a note that
>> in model 2 (that is virtually identical to model 1 as regards to
>> estimates and final gradient values), the value of the logl appears
>> positive.
> Since you add a positive constant to each observation, why would that be
> surprising?
>
> -------------------------------------------------------
> Riccardo (Jack) Lucchetti
> Dipartimento di Scienze Economiche e Sociali (DiSES)
>
> Università Politecnica delle Marche
> (formerly known as Università di Ancona)
>
> r.lucchetti(a)univpm.it
> http://www2.econ.univpm.it/servizi/hpp/lucchetti
> -------------------------------------------------------
>
> ------------------------------
>
> _______________________________________________
> Gretl-users mailing list
> Gretl-users(a)lists.wfu.edu
> http://lists.wfu.edu/mailman/listinfo/gretl-users
>
> End of Gretl-users Digest, Vol 78, Issue 12
> *******************************************
>
Re: [Gretl-users] Graph two densities together and Constant in Log Likelihood
by Alecos Papadopoulos
Riccardo (Jack), thanks for the continuing support.
GRAPH TWO DENSITIES TOGETHER: I saw the function you directed me to in
action, using the examples with series from the gretl databases; now
I am trying to understand what it actually does and whether it suits my
purposes... Exploration is fun.
CONSTANT IN THE LOG-LIKELIHOOD: I will try what you suggest, and also a
few more things I have in mind, and will report back. Just a note that
in model 2 (which is virtually identical to model 1 as regards
estimates and final gradient values), the value of the logl appears
positive.
Alecos Papadopoulos
Athens University of Economics and Business, Greece
Department of Economics
cell:+30-6945-378680
fax: +30-210-8259763
skype:alecos.papadopoulos
>> GRAPH OF TWO DENSITIES TOGETHER: Thanks for providing the older link.
>> Although the code there is to plot two densities consecutively from left to
>> right, while what I need to do is to superimpose them - and this I realize
>> now has the problem of having two different abscissa series. Still, I
>> learned something new about handling plots in Gretl.
> Really? Have you seen this?
>
> http://lists.wfu.edu/pipermail/gretl-users/2013-April/008747.html
>
>
>> CONSTANT IN LOG-LIKELIHOOD
>> The basic code without the constant in the log-l is (omitting the initial
>> part where OLS executes to obtain initial values)
> [...]
>
>> COMMENT: slope coefficients are again comparable and the value of the
>> likelihood is close to what it should have been if its constant term was
>> added afterwards. But the estimates of the three variance terms v0 v1 v2 are
>> totally different, one of them reaching the specified boundary of the parameter
>> space (zero).
> This is very strange indeed. It *may* have something to do with the
> machine epsilon of your computer, but still it's very strange. Basically,
> models 1 and 2 converge to the same maximum (with negligible differences);
> model 3 really doesn't converge at all: BFGS gives you a spurious
> convergence message, but you're not on the maximum. Weird.
>
> Here's a couple of things you may try just to see what happens:
>
> * try using "set bfgs_richardson on"; this uses a different algorithm for
> computing numerical derivatives. Slower, but much more accurate.
>
> * re-parametrise your model so to avoid estimating quantities, such as
> variances, which have a lower bound. Try logarithms instead, for example.
>
>
> -------------------------------------------------------
> Riccardo (Jack) Lucchetti
> Dipartimento di Scienze Economiche e Sociali (DiSES)
>
> Università Politecnica delle Marche
> (formerly known as Università di Ancona)
>
> r.lucchetti(a)univpm.it
> http://www2.econ.univpm.it/servizi/hpp/lucchetti
> -------------------------------------------------------
>
> ------------------------------
>
Re: [Gretl-users] Gretl-users Digest, Vol 78, Issue 8
by cociuba mihai
Dear Allin,
thank you for your help.
Mihai
On Tue, Jul 9, 2013 at 4:00 PM, <gretl-users-request(a)lists.wfu.edu> wrote:
> Send Gretl-users mailing list submissions to
> gretl-users(a)lists.wfu.edu
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.wfu.edu/mailman/listinfo/gretl-users
> or, via email, send a message with subject or body 'help' to
> gretl-users-request(a)lists.wfu.edu
>
> You can reach the person managing the list at
> gretl-users-owner(a)lists.wfu.edu
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Gretl-users digest..."
>
>
> Today's Topics:
>
> 1. retrieving F-stat and p-value from a VAR system (cociuba mihai)
> 2. Re: Constant in log-likelihood and graph of two densities
> together (Allin Cottrell)
> 3. Re: retrieving F-stat and p-value from a VAR system
> (Allin Cottrell)
> 4. Re: retrieving F-stat and p-value from a VAR system
> (Allin Cottrell)
> 5. Implement new criterion for var lag selection
> (Gian Lorenzo Spisso)
> 6. Re: Implement new criterion for var lag selection
> (Riccardo (Jack) Lucchetti)
> 7. Re: Implement new criterion for var lag selection
> (Gian Lorenzo Spisso)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Tue, 9 Jul 2013 01:44:11 +0300
> From: cociuba mihai <cociuba(a)gmail.com>
> Subject: [Gretl-users] retrieving F-stat and p-value from a VAR system
> To: gretl-users(a)lists.wfu.edu
> Message-ID:
> <CADSiGnWsNfdNat0ZNGzib+Qg6TsANuRGS3NTPO0id=
> qY36zcXQ(a)mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Dear GRETL users,
> I'm testing Granger causality between inflation and inflation uncertainty
> for 15 countries and I would like to retrieve the result of the Wald test
> in a matrix; the script that I try to run gets stuck at the last step. Any
> suggestions are welcome.
>
> ###hansl###
> open Table_17.3.gdt
> var 10 M1 R --lagselect
> a=2
> b=3
> c=6
> d=8
> # number of rows 4, but the number of F statistics reported in the VAR
> # output for every equation is 3, so maybe I need more?
> scalar T = 4
> # generate the matrix with 4 rows and 2 columns
> matrix F_stat = zeros(T,2)
> # rename the columns
> # is it possible to have also the name of the F test?
> colnames(F_stat, "t-stat p-value")
> loop foreach i a b c d
> var $i M1 R --nc
> F_stat[$i,] = $test ~ $pvalue
> endloop
> print F_stat
> ###end###
>
> Thanks, Mihai
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL:
> http://lists.wfu.edu/pipermail/gretl-users/attachments/20130709/c6c136c9/...
>
> ------------------------------
>
> Message: 2
> Date: Mon, 8 Jul 2013 21:31:05 -0400 (EDT)
> From: Allin Cottrell <cottrell(a)wfu.edu>
> Subject: Re: [Gretl-users] Constant in log-likelihood and graph of two
> densities together
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.LFD.2.10.1307082117570.23324@myrtle>
> Content-Type: TEXT/PLAIN; charset=US-ASCII; format=flowed
>
> On Mon, 8 Jul 2013, Alecos Papadopoulos wrote:
>
> > Good evening everybody. I am rather new to Gretl and my questions
> > are probably kindergarten-level, but I could not figure out the
> > answers myself or using Help. So here they are
> >
> > 1) I run maximum likelihood from the script window. I am trying
> > two different and non-nested stochastic specifications. I have to
> > compare and evaluate them by using the value of the maximized
> > log-likelihood. But since they are non-nested, their
> > log-likelihood functions are totally different. So, suddenly, the
> > constants of each log-likelihood, although they play no role in
> > the estimation of the parameters, influence the value of the
> > maximized logl - and they are different constants.
> >
> > If I don't include them in the logl function, then the values of
> > the maximized logl (and the AIC and BIC and HQ criteria) will be
> > misleading for comparison purposes of the two competing stochastic
> > specifications, and currently I am doing the corrections by hand
> > (which I can live with). But it would be nice not to have output
> > that needs such corrections. I tried to include them in the
> > specification of the logl after the "mle logl = " command. But
> > when I tried to include them as, say, "ln(4/sqrt(2/pi))" or
> > "ln(4/sqrt(2/%pi)) I get "syntax error on the command line".
>
> The recommended way of accessing pi = 3.14... in current gretl
> (version 1.9.12) is "$pi", though plain "pi" (deprecated since May
> 2012) will still work; "%pi" will definitely not work. The
> expression
>
> ln(4/sqrt(2/$pi))
>
> is correctly evaluated as 1.612... in current gretl.
>
> > When I calculate them explicitly, say 0.45678 and enter this
> > constant instead, Gretl runs, but the estimation goes astray, and
> > produces different results than when the constant is not included.
> > I suspect that this may have something to do with the fact that I
> > do not specify analytical derivatives, but I really don't know.
> > What am I doing wrong?
>
> The issue of analytical versus numerical derivatives wouldn't seem
> to be relevant to the inclusion or non-inclusion of a constant term
> (which obviously doesn't have a derivative) in the log-likelihood.
> I suppose something else must be wrong here. I think you'll have to
> show us your full script to get useful help.
>
> > 2) Again for comparison purposes, I would want to have in one graph the
> > estimated densities of two series. But when I select two series the
> > "Variable" menu becomes disabled, while in the "View" menu there are
> > various graph options, but not the option to graph the estimated
> > densities of the two series together. Is there a way around this?
>
> This question has come up before. Please see
> http://lists.wfu.edu/pipermail/gretl-users/2013-April/008745.html
>
> Allin Cottrell
>
>
> ------------------------------
>
> Message: 3
> Date: Mon, 8 Jul 2013 21:59:16 -0400 (EDT)
> From: Allin Cottrell <cottrell(a)wfu.edu>
> Subject: Re: [Gretl-users] retrieving F-stat and p-value from a VAR
> system
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.LFD.2.10.1307082142420.23324@myrtle>
> Content-Type: TEXT/PLAIN; charset=US-ASCII; format=flowed
>
> On Tue, 9 Jul 2013, cociuba mihai wrote:
>
> > I'm testing Granger causality between inflation and inflation
> > uncertainty for 15 countries and I would like to retrieve the
> > result of the Wald test...
>
> What Wald test? (That is, for what null hypothesis?)
>
> > in a matrix, the script that I try to run gets stuck at the last
> > step. Any suggestion are welcome.
>
> [last step]
> > loop foreach i a b c d
> > var $i M1 R --nc
> > F_stat[$i,] = $test ~ $pvalue
> > endloop
>
> The "var" command in gretl does not supply a $test accessor. In fact
> no model estimation command in gretl does that: the label "test" is
> much too general, given that many sorts of tests might be
> contemplated after estimating a given model (either single-equation
> or multi-equation).
>
> Since a VAR is just a collection of equations related in a certain
> way (identical right-hand sides, specific relation between left-hand
> side variables and right-hand sides), estimated in practice via OLS,
> you can get whatever Wald statistics you want by estimating the
> equations singly via the "ols" command, and using either "omit" or
> "restrict" (which do produce $test and $pvalue).
>
> (I suppose we could generalize the current scalar $Fstat accessor
> for single equation models to a matrix for VARs, but that would
> require some decisions on which F-stats to include and in what
> configuration.)
>
> Allin Cottrell
>
>
> ------------------------------
>
> Message: 4
> Date: Mon, 8 Jul 2013 22:15:21 -0400 (EDT)
> From: Allin Cottrell <cottrell(a)wfu.edu>
> Subject: Re: [Gretl-users] retrieving F-stat and p-value from a VAR
> system
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.LFD.2.10.1307082212280.23324@myrtle>
> Content-Type: TEXT/PLAIN; charset=US-ASCII; format=flowed
>
> On Mon, 8 Jul 2013, Allin Cottrell wrote:
>
> > On Tue, 9 Jul 2013, cociuba mihai wrote:
> >
> >> I'm testing Granger causality between inflation and inflation
> uncertainty
> >> for 15 countries and I would like to retrieve the result of the Wald
> >> test...
> >
> > What Wald test? (That is, for what null hypothesis?)
>
> OK, in fact clear enough from context. Trivial example of what I
> described in my previous posting:
>
> <hansl>
> open data9-7
> scalar p = 4
> var p PRIME UNEMP
> list RHS = const PRIME(-1 to -p) UNEMP(-1 to -p)
> # first equation: does UNEMP Granger-cause PRIME?
> ols PRIME RHS --quiet
> omit UNEMP(-1 to -p) --quiet --test-only
> eval $test
> eval $pvalue
> # second equation: does PRIME Granger-cause UNEMP?
> ols UNEMP RHS --quiet
> omit PRIME(-1 to -p) --quiet --test-only
> eval $test
> eval $pvalue
> </hansl>
>
> Allin Cottrell
>
>
> ------------------------------
>
> Message: 5
> Date: Tue, 9 Jul 2013 13:15:43 +0200
> From: Gian Lorenzo Spisso <glspisso(a)gmail.com>
> Subject: [Gretl-users] Implement new criterion for var lag selection
> To: gretl-users(a)lists.wfu.edu
> Message-ID:
> <CAJ_wB9=
> gLShM2DdET7uk_f0CDBTBRgdcsqj_G_mvhkHE4jcE_w(a)mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Hi all,
> I would like to implement in GRETL the procedure for lag selection of a VAR
> as specified here:
> http://www.tandfonline.com/doi/pdf/10.1080/1350485022000041050 which
> essentially replaces BIC and HQC with a weighted average of the two.
>
> Is there any easy to install package that I could use?
> Otherwise could it be possible to simply reprogram AIC column to show this
> criterion instead? In case can anybody provide a little guidance for the
> process? I am not familiar with gretl programming.
>
> Thank you,
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL:
> http://lists.wfu.edu/pipermail/gretl-users/attachments/20130709/f27dfb6a/...
>
> ------------------------------
>
> Message: 6
> Date: Tue, 9 Jul 2013 14:45:28 +0200 (CEST)
> From: "Riccardo (Jack) Lucchetti" <r.lucchetti(a)univpm.it>
> Subject: Re: [Gretl-users] Implement new criterion for var lag
> selection
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.DEB.2.10.1307091444170.13798(a)ec-4.econ.univpm.it>
> Content-Type: text/plain; charset="iso-8859-1"
>
> On Tue, 9 Jul 2013, Gian Lorenzo Spisso wrote:
>
> > Hi all,
> > I would like to implement in GRETL the procedure for lag selection of a
> VAR
> > as specified here:
> > http://www.tandfonline.com/doi/pdf/10.1080/1350485022000041050 which
> > essentially replaces BIC and HQC with a weighted average of the two.
> >
> > Is there any easy to install package that I could use?
> > Otherwise could it be possible to simply reprogram AIC column to show
> this
> > criterion instead? In case can anybody provide a little guidance for the
> > process? I am not familiar with gretl programming.
>
> I don't have a subscription to "Applied Economics Journal". Could you
> describe the proposed method to me?
>
> -------------------------------------------------------
> Riccardo (Jack) Lucchetti
> Dipartimento di Scienze Economiche e Sociali (DiSES)
>
> Università Politecnica delle Marche
> (formerly known as Università di Ancona)
>
> r.lucchetti(a)univpm.it
> http://www2.econ.univpm.it/servizi/hpp/lucchetti
> -------------------------------------------------------
>
> ------------------------------
>
> Message: 7
> Date: Tue, 9 Jul 2013 14:58:42 +0200
> From: Gian Lorenzo Spisso <glspisso(a)gmail.com>
> Subject: Re: [Gretl-users] Implement new criterion for var lag
> selection
> To: r.lucchetti(a)univpm.it, Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID:
> <CAJ_wB9ndVjdNygwUYYDQEc9Ad7=+
> Px9q4wDW+orXLza6f28rjw(a)mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Dear Riccardo,
> I attach a screenshot of the relevant part.
> You can see the formulas for the two criteria, and the new criterion
> proposed by Hatemi, which simply averages the two. He then goes on to use
> a Monte Carlo simulation to show that this mixed criterion has a higher
> probability of picking the right lag.
>
>
> On Tue, Jul 9, 2013 at 2:45 PM, Riccardo (Jack) Lucchetti <
> r.lucchetti(a)univpm.it> wrote:
>
> > On Tue, 9 Jul 2013, Gian Lorenzo Spisso wrote:
> >
> > Hi all,
> >> I would like to implement in GRETL the procedure for lag selection of a
> >> VAR
> >> as specified here:
> >> http://www.tandfonline.com/doi/pdf/10.1080/1350485022000041050 which
> >> essentially replaces BIC and HQC with a weighted average of the two.
> >>
> >> Is there any easy to install package that I could use?
> >> Otherwise could it be possible to simply reprogram AIC column to show
> this
> >> criterion instead? In case can anybody provide a little guidance for the
> >> process? I am not familiar with gretl programming.
> >>
> >
> > I don't have a subscription to "Applied Economics Journal". Could you
> > describe the proposed method to me?
> >
> > -------------------------------------------------------
> > Riccardo (Jack) Lucchetti
> > Dipartimento di Scienze Economiche e Sociali (DiSES)
> >
> > Università Politecnica delle Marche
> > (formerly known as Università di Ancona)
> >
> > r.lucchetti(a)univpm.it
> > http://www2.econ.univpm.it/servizi/hpp/lucchetti
> > -------------------------------------------------------
> > _______________________________________________
> > Gretl-users mailing list
> > Gretl-users(a)lists.wfu.edu
> > http://lists.wfu.edu/mailman/listinfo/gretl-users
> >
>
>
>
> --
> Gian Lorenzo Spisso
>
>
> *Phone*: 415-359-4330
> *Skype*: glspisso
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL:
> http://lists.wfu.edu/pipermail/gretl-users/attachments/20130709/be5d2ca1/...
> -------------- next part --------------
> A non-text attachment was scrubbed...
> Name: HatemiCriterion.jpg
> Type: image/jpeg
> Size: 96716 bytes
> Desc: not available
> Url :
> http://lists.wfu.edu/pipermail/gretl-users/attachments/20130709/be5d2ca1/...
>
> ------------------------------
>
> _______________________________________________
> Gretl-users mailing list
> Gretl-users(a)lists.wfu.edu
> http://lists.wfu.edu/mailman/listinfo/gretl-users
>
> End of Gretl-users Digest, Vol 78, Issue 8
> ******************************************
>
Re: [Gretl-users] Constant in log-likelihood and graph of two densities together (Allin Cottrell)
by Alecos Papadopoulos
GRAPH OF TWO DENSITIES TOGETHER: Thanks for providing the older link.
The code there, though, plots two densities consecutively from left to
right, while what I need is to superimpose them - and this, I realize
now, runs into the problem of having two different abscissa series.
Still, I learned something new about handling plots in Gretl.
CONSTANT IN LOG-LIKELIHOOD
The basic code without the constant in the log-l is (omitting the
initial part where OLS is run to obtain initial values)
<<
matrix Depv = {LWAGE}
matrix Regrs = {const, EXP, EXP2, WKS, OCC, IND, SOUTH, SMSA,
MS, FEM, UNION, ED, BLK}
matrix cVec = {c0,c1,c2,c3,c4,c5,c6,c7,c8,c9,c10,c11,c12}'
scalar v0 = 1
scalar v1 = 1
scalar v2 = 1
mle logl = check ? -ln(v) - 0.5*(e2hn/v)^2 + ln(cdf(D,l1/sqrt(1+l1^2),
e2hn/omega1, 0) - cdf(D,-l2/sqrt(1+l2^2), e2hn/omega2, 0)):NA
series e2hn = Depv - Regrs*cVec
scalar v = sqrt(v0^2 + v1^2 + v2^2)
scalar l1 = (v2/v1)*(v/v0)
scalar l2 = (v1/v2)*(v/v0)
scalar omega1 = (v*v0/v1)*sqrt(1+ (v2/v0)^2)
scalar omega2 = (v*v0/v2)*sqrt(1+ (v1/v0)^2)
scalar check = (v0>0) && (v1>0) && (v2>0)
params cVec v0 v1 v2
end mle --verbose
>>
and gives final results
<<
--- FINAL VALUES:
loglikelihood = -447.517658694 (steplength = 8.38861e-017)
Parameters: 5.6103 0.029306 -0.00048463 0.0036368 -0.16393 0.083254
-0.058693 0.16568 0.093867 -0.32751 0.10612 0.056644
-0.18925 0.21983 0.26368 0.28759
Gradients: 7.1632e-005 -0.00019598 0.0051691 -0.00055942 5.6355e-005 2.3537e-006
4.7828e-005 -7.9492e-006 2.2249e-005 -2.2649e-006 2.5091e-005 -0.00029823
5.1958e-006 7.7593e-005 5.9452e-006 -2.5091e-005 (norm 5.45e-003)
Tolerance = 1.81899e-012
Function evaluations: 397
Evaluations of gradient: 72
Model 3: ML, using observations 1-595
logl = check ? -ln(v) - 0.5*(e2hn/v)^2 + ln(cdf(D,l1/sqrt(1+l1^2),
e2hn/omega1, 0) - cdf(D,-l2/sqrt(1+l2^2), e2hn/omega2, 0)):NA
Standard errors based on Outer Products matrix
estimate std. error z p-value
----------------------------------------------------------
cVec[1] 5.61026 0.189973 29.53 1.12e-191 ***
cVec[2] 0.0293063 0.00650305 4.507 6.59e-06 ***
cVec[3] -0.000484630 0.000127917 -3.789 0.0002 ***
cVec[4] 0.00363680 0.00253677 1.434 0.1517
cVec[5] -0.163931 0.0372662 -4.399 1.09e-05 ***
cVec[6] 0.0832535 0.0305658 2.724 0.0065 ***
cVec[7] -0.0586933 0.0300906 -1.951 0.0511 *
cVec[8] 0.165683 0.0296335 5.591 2.26e-08 ***
cVec[9] 0.0938665 0.0469460 1.999 0.0456 **
cVec[10] -0.327510 0.0678567 -4.826 1.39e-06 ***
cVec[11] 0.106121 0.0335694 3.161 0.0016 ***
cVec[12] 0.0566442 0.00623447 9.086 1.03e-019 ***
cVec[13] -0.189253 0.0551030 -3.435 0.0006 ***
v0 0.219829 0.0951096 2.311 0.0208 **
v1 0.263683 0.111617 2.362 0.0182 **
v2 0.287589 0.103953 2.767 0.0057 ***
Log-likelihood -447.5177 Akaike criterion 927.0353
Schwarz criterion 997.2523 Hannan-Quinn 954.3796
>>
----------------------------------------------------------
If I specify
mle logl = check ? ln(4/sqrt(2/$pi)) - ln(v) etc
I get
<<
--- FINAL VALUES:
loglikelihood = 511.673340992 (steplength = 1.6384e-010)
Parameters: 5.6103 0.029306 -0.00048463 0.0036368 -0.16393 0.083254
-0.058694 0.16568 0.093868 -0.32751 0.10612 0.056644
-0.18925 0.21983 0.26368 0.28759
Gradients: -0.00013035 0.0013600 0.052876 -0.0098550 6.4448e-005 -0.00051251
0.00057035 -8.1379e-006 -0.00036188 -0.00042025 0.00034582 -0.0013756
0.00053909 -0.0013437 0.00054025 -0.00083513 (norm 1.11e-002)
Tolerance = 1.81899e-012
Function evaluations: 493
Evaluations of gradient: 82
Model 3: ML, using observations 1-595
logl = check ? ln(4/sqrt(2/$pi)) -ln(v) - 0.5*(e2hn/v)^2 +
ln(cdf(D,l1/sqrt(1+l1^2), e2hn/omega1, 0) - cdf(D,-l2/sqrt(1+l2^2),
e2hn/omega2, 0)):NA
Standard errors based on Outer Products matrix
estimate std. error z p-value
----------------------------------------------------------
cVec[1] 5.61027 0.189973 29.53 1.12e-191 ***
cVec[2] 0.0293063 0.00650305 4.507 6.59e-06 ***
cVec[3] -0.000484629 0.000127917 -3.789 0.0002 ***
cVec[4] 0.00363683 0.00253677 1.434 0.1517
cVec[5] -0.163931 0.0372663 -4.399 1.09e-05 ***
cVec[6] 0.0832537 0.0305658 2.724 0.0065 ***
cVec[7] -0.0586936 0.0300906 -1.951 0.0511 *
cVec[8] 0.165683 0.0296335 5.591 2.26e-08 ***
cVec[9] 0.0938678 0.0469461 1.999 0.0456 **
cVec[10] -0.327508 0.0678569 -4.826 1.39e-06 ***
cVec[11] 0.106121 0.0335695 3.161 0.0016 ***
cVec[12] 0.0566442 0.00623448 9.086 1.03e-019 ***
cVec[13] -0.189254 0.0551031 -3.435 0.0006 ***
v0 0.219833 0.0951107 2.311 0.0208 **
v1 0.263678 0.111623 2.362 0.0182 **
v2 0.287587 0.103956 2.766 0.0057 ***
Log-likelihood 511.6733 Akaike criterion -991.3467
Schwarz criterion -921.1297 Hannan-Quinn -964.0024
>>
COMMENT: all parameter estimates are very close, but the value of the
log-likelihood is positive.
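The positive value is in fact what one would expect from adding a per-observation constant: the maximised log-likelihood simply shifts by nobs times the constant. A quick arithmetic check in Python (illustrative only, outside gretl), using the figures reported above:

```python
import math

nobs = 595
logl_no_const = -447.517658694            # model 1, no constant in the logl
c = math.log(4 / math.sqrt(2 / math.pi))  # the constant as written in model 2

# adding a per-observation constant shifts the total logl by nobs * c
logl_shifted = logl_no_const + nobs * c
print(logl_shifted)  # ~511.673, matching model 2's reported logl
```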
---------------------------------------------
If I specify mle logl = check ? 0.467355827915218 - ln(v) etc I get
<<
--- FINAL VALUES:
loglikelihood = -172.877055337 (steplength = 5.36871e-021)
Parameters: 5.7197 0.029288 -0.00048358 0.0037060 -0.17731 0.065087
-0.062683 0.16589 0.096647 -0.34367 0.098338 0.054146
-0.18382 2.2828e-008 0.33034 0.42766
Gradients: 0.98941 -16.941 65.400 61.632 -0.42174 0.31255
-0.078953 0.32094 -0.099944 -0.028678 1.6085 23.937
0.80591 6.6747e-005 0.045333 0.0018853 (norm 7.16e-001)
Tolerance = 1.81899e-012
Function evaluations: 502
Evaluations of gradient: 79
Model 5: ML, using observations 1-595
logl = check ? 0.467355827915218 -ln(v) - 0.5*(e2hn/v)^2 +
ln(cdf(D,l1/sqrt(1+l1^2), e2hn/omega1, 0) - cdf(D,-l2/sqrt(1+l2^2),
e2hn/omega2, 0)):NA
Standard errors based on Outer Products matrix
estimate std. error z p-value
----------------------------------------------------------------
cVec[1] 5.71974 0.179086 31.94 7.79e-224 ***
cVec[2] 0.0292883 0.00593066 4.938 7.87e-07 ***
cVec[3] -0.000483577 0.000116778 -4.141 3.46e-05 ***
cVec[4] 0.00370595 0.00245961 1.507 0.1319
cVec[5] -0.177311 0.0358279 -4.949 7.46e-07 ***
cVec[6] 0.0650868 0.0284437 2.288 0.0221 **
cVec[7] -0.0626826 0.0289442 -2.166 0.0303 **
cVec[8] 0.165888 0.0279917 5.926 3.10e-09 ***
cVec[9] 0.0966475 0.0452880 2.134 0.0328 **
cVec[10] -0.343670 0.0618104 -5.560 2.70e-08 ***
cVec[11] 0.0983375 0.0315998 3.112 0.0019 ***
cVec[12] 0.0541457 0.00606580 8.926 4.40e-019 ***
cVec[13] -0.183823 0.0524279 -3.506 0.0005 ***
v0 2.28282e-08 405491 0.0000 1.0000
v1 0.330336 0.0328825 10.05 9.57e-024 ***
v2 0.427663 0.0335927 12.73 3.98e-037 ***
Log-likelihood -172.8771 Akaike criterion 377.7541
Schwarz criterion 447.9711 Hannan-Quinn 405.0984
>>
COMMENT: slope coefficients are again comparable, and the value of the
likelihood is close to what it should have been if its constant term was
added afterwards. But the estimates of the three variance terms v0, v1, v2
are totally different, with v0 driven to the specified boundary of the
parameter space (zero).
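As a quick sanity check outside gretl (a minimal Python sketch, not part of the hansl scripts above): the constant evaluates to the value quoted in the thread, and with n = 595 observations, truncating it to 7 decimal digits shifts the total log-likelihood by only about 2e-5. In principle that is far too small to break the maximization, which suggests the failure is a numerical-optimization sensitivity rather than an analytical effect of the constant's precision.

```python
import math

# The per-observation constant discussed in the thread: ln(4 / sqrt(2*pi)).
c = math.log(4 / math.sqrt(2 * math.pi))
print(c)  # 0.4673558279152...

# Truncating the constant to 7 decimal digits changes each per-observation
# contribution by under 3e-8; summed over n = 595 observations the total
# log-likelihood shifts by roughly 1.7e-5.
n = 595
c_truncated = 0.4673558
shift = n * abs(c - c_truncated)
print(shift)
```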
Alecos Papadopoulos
Athens University of Economics and Business, Greece
Department of Economics
cell:+30-6945-378680
fax: +30-210-8259763
skype:alecos.papadopoulos
On 9/7/2013 16:00, gretl-users-request(a)lists.wfu.edu wrote:
> Send Gretl-users mailing list submissions to
> gretl-users(a)lists.wfu.edu
>
> To subscribe or unsubscribe via the World Wide Web, visit
> http://lists.wfu.edu/mailman/listinfo/gretl-users
> or, via email, send a message with subject or body 'help' to
> gretl-users-request(a)lists.wfu.edu
>
> You can reach the person managing the list at
> gretl-users-owner(a)lists.wfu.edu
>
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of Gretl-users digest..."
>
>
> Today's Topics:
>
> 1. retrieving F-stat and p-value from a VAR system (cociuba mihai)
> 2. Re: Constant in log-likelihood and graph of two densities
> together (Allin Cottrell)
> 3. Re: retrieving F-stat and p-value from a VAR system
> (Allin Cottrell)
> 4. Re: retrieving F-stat and p-value from a VAR system
> (Allin Cottrell)
> 5. Implement new criterion for var lag selection
> (Gian Lorenzo Spisso)
> 6. Re: Implement new criterion for var lag selection
> (Riccardo (Jack) Lucchetti)
> 7. Re: Implement new criterion for var lag selection
> (Gian Lorenzo Spisso)
>
>
> ----------------------------------------------------------------------
>
> Message: 1
> Date: Tue, 9 Jul 2013 01:44:11 +0300
> From: cociuba mihai <cociuba(a)gmail.com>
> Subject: [Gretl-users] retrieving F-stat and p-value from a VAR system
> To: gretl-users(a)lists.wfu.edu
> Message-ID:
> <CADSiGnWsNfdNat0ZNGzib+Qg6TsANuRGS3NTPO0id=qY36zcXQ(a)mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Dear GRETL users,
> I'm testing Granger causality between inflation and inflation uncertainty
> for 15 countries and I would like to retrieve the result of the Wald test
> in a matrix, the script that I try to run gets stuck at the last step. Any
> suggestions are welcome.
>
> ###hansl###
> open Table_17.3.gdt
> var 10 M1 R --lagselect
> a=2
> b=3
> c=6
> d=8
> #number of rows is 4, but the number of F statistics reported in the VAR
> #output for every equation is 3, so maybe I need more?
> scalar T = 4
> #generate the matrix with 4 rows and 2 columns
> matrix F_stat = zeros(T,2)
> #rename the columns
> # is it possible to have also the name of the F test?
> colnames(F_stat, "t-stat p-value")
> loop foreach i a b c d
> var $i M1 R --nc
> F_stat[$i,] = $test ~ $pvalue
> endloop
> print F_stat
> ###end###
>
> Thanks, Mihai
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: http://lists.wfu.edu/pipermail/gretl-users/attachments/20130709/c6c136c9/...
>
> ------------------------------
>
> Message: 2
> Date: Mon, 8 Jul 2013 21:31:05 -0400 (EDT)
> From: Allin Cottrell <cottrell(a)wfu.edu>
> Subject: Re: [Gretl-users] Constant in log-likelihood and graph of two
> densities together
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.LFD.2.10.1307082117570.23324@myrtle>
> Content-Type: TEXT/PLAIN; charset=US-ASCII; format=flowed
>
> On Mon, 8 Jul 2013, Alecos Papadopoulos wrote:
>
>> Good evening everybody. I am rather new to Gretl and my questions
>> are probably kindergarten-level, but I could not figure out the
>> answers myself or using Help. So here they are
>>
>> 1) I run maximum likelihood from the script window. I am trying
>> two different and non-nested stochastic specifications. I have to
>> compare and evaluate them by using the value of the maximized
>> log-likelihood. But since they are non-nested, their
>> log-likelihood functions are totally different. So, suddenly, the
>> constants of each log-likelihood, although they play no role in
>> the estimation of the parameters, influence the value of the
>> maximized logl - and they are different constants.
>>
>> If I don't include them in the logl function, then the values of
>> the maximized logl (and the AIC and BIC and HQ criteria) will be
>> misleading for comparison purposes of the two competing stochastic
>> specifications, and currently I am doing the corrections by hand
>> (which I can live with). But it would be nice not to have output
>> that needs such corrections. I tried to include them in the
>> specification of the logl after the "mle logl = " command. But
>> when I tried to include them as, say, "ln(4/sqrt(2/pi))" or
>> "ln(4/sqrt(2/%pi)) I get "syntax error on the command line".
> The recommended way of accessing pi = 3.14... in current gretl
> (version 1.9.12) is "$pi", though plain "pi" (deprecated since May
> 2012) will still work; "%pi" will definitely not work. The
> expression
>
> ln(4/sqrt(2/$pi))
>
> is correctly evaluated as 1.612... in current gretl.
>
>> When I calculate them explicitly, say 0.45678 and enter this
>> constant instead, Gretl runs, but the estimation goes astray, and
>> produces different results than when the constant is not included.
>> I suspect that this may have something to do with the fact that I
>> do not specify analytical derivatives, but I really don't know.
>> What am I doing wrong?
> The issue of analytical versus numerical derivatives wouldn't seem
> to be relevant to the inclusion or non-inclusion of a constant term
> (which obviously doesn't have a derivative) in the log-likelihood.
> I suppose something else must be wrong here. I think you'll have to
> show us your full script to get useful help.
>
>> 2) Again for comparison purposes, I would want to have in one graph the
>> estimated densities of two series. But when I select two series the
>> "Variable" menu becomes disabled, while in the "View" menu there are
>> various graph options, but not the option to graph the estimated
>> densities of the two series together. Is there a way around this?
> This question has come up before. Please see
> http://lists.wfu.edu/pipermail/gretl-users/2013-April/008745.html
>
> Allin Cottrell
>
>
> ------------------------------
>
> Message: 3
> Date: Mon, 8 Jul 2013 21:59:16 -0400 (EDT)
> From: Allin Cottrell <cottrell(a)wfu.edu>
> Subject: Re: [Gretl-users] retrieving F-stat and p-value from a VAR
> system
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.LFD.2.10.1307082142420.23324@myrtle>
> Content-Type: TEXT/PLAIN; charset=US-ASCII; format=flowed
>
> On Tue, 9 Jul 2013, cociuba mihai wrote:
>
>> I'm testing Granger causality between inflation and inflation
>> uncertainty for 15 countries and I would like to retrieve the
>> result of the Wald test...
> What Wald test? (That is, for what null hypothesis?)
>
>> in a matrix, the script that I try to run gets stuck at the last
>> step. Any suggestion are welcome.
> [last step]
>> loop foreach i a b c d
>> var $i M1 R --nc
>> F_stat[$i,] = $test ~ $pvalue
>> endloop
> The "var" command in gretl does not supply a $test accessor. In fact
> no model estimation command in gretl does that: the label "test" is
> much too general, given that many sorts of tests might be
> contemplated after estimating a given model (either single-equation
> or multi-equation).
>
> Since a VAR is just a collection of equations related in a certain
> way (identical right-hand sides, specific relation between left-hand
> side variables and right-hand sides), estimated in practice via OLS,
> you can get whatever Wald statistics you want by estimating the
> equations singly via the "ols" command, and using either "omit" or
> "restrict" (which do produce $test and $pvalue).
>
> (I suppose we could generalize the current scalar $Fstat accessor
> for single equation models to a matrix for VARs, but that would
> require some decisions on which F-stats to include and in what
> configuration.)
>
> Allin Cottrell
>
>
> ------------------------------
>
> Message: 4
> Date: Mon, 8 Jul 2013 22:15:21 -0400 (EDT)
> From: Allin Cottrell <cottrell(a)wfu.edu>
> Subject: Re: [Gretl-users] retrieving F-stat and p-value from a VAR
> system
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.LFD.2.10.1307082212280.23324@myrtle>
> Content-Type: TEXT/PLAIN; charset=US-ASCII; format=flowed
>
> On Mon, 8 Jul 2013, Allin Cottrell wrote:
>
>> On Tue, 9 Jul 2013, cociuba mihai wrote:
>>
>>> I'm testing Granger causality between inflation and inflation uncertainty
>>> for 15 countries and I would like to retrieve the result of the Wald
>>> test...
>> What Wald test? (That is, for what null hypothesis?)
> OK, in fact clear enough from context. Trivial example of what I
> described in my previous posting:
>
> <hansl>
> open data9-7
> scalar p = 4
> var p PRIME UNEMP
> list RHS = const PRIME(-1 to -p) UNEMP(-1 to -p)
> # first equation: does UNEMP Granger-cause PRIME?
> ols PRIME RHS --quiet
> omit UNEMP(-1 to -p) --quiet --test-only
> eval $test
> eval $pvalue
> # second equation: does PRIME Granger-cause UNEMP?
> ols UNEMP RHS --quiet
> omit PRIME(-1 to -p) --quiet --test-only
> eval $test
> eval $pvalue
> </hansl>
>
> Allin Cottrell
>
>
> ------------------------------
>
> Message: 5
> Date: Tue, 9 Jul 2013 13:15:43 +0200
> From: Gian Lorenzo Spisso <glspisso(a)gmail.com>
> Subject: [Gretl-users] Implement new criterion for var lag selection
> To: gretl-users(a)lists.wfu.edu
> Message-ID:
> <CAJ_wB9=gLShM2DdET7uk_f0CDBTBRgdcsqj_G_mvhkHE4jcE_w(a)mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Hi all,
> I would like to implement in GRETL the procedure for lag selection of a VAR
> as specified here:
> http://www.tandfonline.com/doi/pdf/10.1080/1350485022000041050 which
> essentially replaces BIC and HQC with a weighted average of the two.
>
> Is there any easy to install package that I could use?
> Otherwise could it be possible to simply reprogram AIC column to show this
> criterion instead? In case can anybody provide a little guidance for the
> process? I am not familiar with gretl programming.
>
> Thank you,
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: http://lists.wfu.edu/pipermail/gretl-users/attachments/20130709/f27dfb6a/...
>
> ------------------------------
>
> Message: 6
> Date: Tue, 9 Jul 2013 14:45:28 +0200 (CEST)
> From: "Riccardo (Jack) Lucchetti" <r.lucchetti(a)univpm.it>
> Subject: Re: [Gretl-users] Implement new criterion for var lag
> selection
> To: Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID: <alpine.DEB.2.10.1307091444170.13798(a)ec-4.econ.univpm.it>
> Content-Type: text/plain; charset="iso-8859-1"
>
> On Tue, 9 Jul 2013, Gian Lorenzo Spisso wrote:
>
>> Hi all,
>> I would like to implement in GRETL the procedure for lag selection of a VAR
>> as specified here:
>> http://www.tandfonline.com/doi/pdf/10.1080/1350485022000041050 which
>> essentially replaces BIC and HQC with a weighted average of the two.
>>
>> Is there any easy to install package that I could use?
>> Otherwise could it be possible to simply reprogram AIC column to show this
>> criterion instead? In case can anybody provide a little guidance for the
>> process? I am not familiar with gretl programming.
> I don't have a subscription to "Applied Economics Journal". Could you
> describe the proposed method to me?
>
> -------------------------------------------------------
> Riccardo (Jack) Lucchetti
> Dipartimento di Scienze Economiche e Sociali (DiSES)
>
> Università Politecnica delle Marche
> (formerly known as Università di Ancona)
>
> r.lucchetti(a)univpm.it
> http://www2.econ.univpm.it/servizi/hpp/lucchetti
> -------------------------------------------------------
>
> ------------------------------
>
> Message: 7
> Date: Tue, 9 Jul 2013 14:58:42 +0200
> From: Gian Lorenzo Spisso <glspisso(a)gmail.com>
> Subject: Re: [Gretl-users] Implement new criterion for var lag
> selection
> To: r.lucchetti(a)univpm.it, Gretl list <gretl-users(a)lists.wfu.edu>
> Message-ID:
> <CAJ_wB9ndVjdNygwUYYDQEc9Ad7=+Px9q4wDW+orXLza6f28rjw(a)mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
>
> Dear Riccardo,
> I attach a screenshot of the relevant part.
> You can see the formulas for the two criteria, and the new criterion
> proposed by Hatemi, which simply averages the two. He then goes on to use
> a Monte Carlo simulation to show that this mixed criterion has a higher
> probability of picking the right lag.
>
>
> On Tue, Jul 9, 2013 at 2:45 PM, Riccardo (Jack) Lucchetti <
> r.lucchetti(a)univpm.it> wrote:
>
>> On Tue, 9 Jul 2013, Gian Lorenzo Spisso wrote:
>>
>> Hi all,
>>> I would like to implement in GRETL the procedure for lag selection of a
>>> VAR
>>> as specified here:
>>> http://www.tandfonline.com/doi/pdf/10.1080/1350485022000041050 which
>>> essentially replaces BIC and HQC with a weighted average of the two.
>>>
>>> Is there any easy to install package that I could use?
>>> Otherwise could it be possible to simply reprogram AIC column to show this
>>> criterion instead? In case can anybody provide a little guidance for the
>>> process? I am not familiar with gretl programming.
>>>
>> I don't have a subscription to "Applied Economics Journal". Could you
>> describe the proposed method to me?
>>
>> -------------------------------------------------------
>> Riccardo (Jack) Lucchetti
>> Dipartimento di Scienze Economiche e Sociali (DiSES)
>>
>> Università Politecnica delle Marche
>> (formerly known as Università di Ancona)
>>
>> r.lucchetti(a)univpm.it
>> http://www2.econ.univpm.it/servizi/hpp/lucchetti
>> -------------------------------------------------------
>> _______________________________________________
>> Gretl-users mailing list
>> Gretl-users(a)lists.wfu.edu
>> http://lists.wfu.edu/mailman/listinfo/gretl-users
>>
>
>
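For reference, the criterion discussed in messages 5-7 (Hatemi's simple average of BIC and HQC) can be sketched outside gretl. This is a minimal Python sketch using the standard textbook VAR normalizations (ln|Σ| plus a penalty scaled by the parameter count k and sample size T), not code taken from the paywalled paper:

```python
import math

def bic(logdet_sigma, k, T):
    # Schwarz criterion for a VAR: ln|Sigma| + k*ln(T)/T,
    # where k is the total number of estimated parameters.
    return logdet_sigma + k * math.log(T) / T

def hqc(logdet_sigma, k, T):
    # Hannan-Quinn criterion: ln|Sigma| + 2*k*ln(ln(T))/T.
    return logdet_sigma + 2 * k * math.log(math.log(T)) / T

def hjc(logdet_sigma, k, T):
    # The proposed criterion: the simple average of BIC and HQC.
    # Its penalty always sits between the two parents'.
    return 0.5 * (bic(logdet_sigma, k, T) + hqc(logdet_sigma, k, T))
```

One would compute ln|Σ| and k for each candidate lag length from the VAR estimates and pick the lag minimizing hjc, just as gretl's --lagselect does for the built-in criteria.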
heteroscedastic tobit model
by ROGER MCNEILL
According to the gretl test for normality of residuals in my estimated Tobit model, heteroscedasticity is a problem, meaning the Tobit estimator is inconsistent. Does gretl have an alternative estimator for censored regressions, such as a least absolute deviations estimator or a maximum likelihood estimator that is consistent in the face of non-normal errors?
Roger McNeill
segmentation fault
by Artur Tarassow
Hi all,
I obtain a segmentation fault after defining a simple string.
But first I have to say that I updated to current cvs by applying the
usual linux command:
<terminal>
cvs update -d -P && ./configure --enable-build-doc --enable-gtk3
--enable-openmp --prefix=/usr && make clean -j2 && make -j2 && sudo
make install -j2
</terminal>
Nevertheless, the build date is shown as 2013-06-10, but it should
be a more recent version, right?
The segmentation fault emerges after running this script:
<hansl>
open denmark
win = $windows
if win = 1
sprintf functions "%s", "C:\Users\artur.tarassow\Dropbox\gretl_script"
else
sprintf functions "%s", "/home/artur/Dropbox/gretl_script"
endif
</hansl>
Cheers,
Artur