shell_ok question
by oleg_komashko@ukr.net
Dear all,
The User's Guide says that
the shell_ok on/off setting lives in the GUI
for security reasons.
What are the typical dangers?
Oleh
9 years, 2 months
posting etiquette
by Allin Cottrell
Just a brief comment on posting to the gretl lists. When you're
starting a new topic, please do this by starting a new email, and
_not_ by replying to a current posting on a different topic but
replacing its subject line (also known as hijacking a thread).
It's easy to start a new posting; the address is
gretl-users(a)lists.wfu.edu .
The downside of hijacking an existing thread is that it messes up
the threaded listing of postings to the list. This is a useful
resource when anyone wants to look back on what was said on a given
topic, but its clarity is diminished by hijacking. See for example
http://lists.wfu.edu/pipermail/gretl-users/2015-October/thread.html
(but there are plenty of other examples not restricted to any one
poster). You see, pipermail knows very well when you're replying to
an existing thread even if you try to disguise the fact!
Allin Cottrell
9 years, 2 months
Optimization Problem gretl vs Excel Solver
by Pindar Os
Hi there,
it's been a while since my last post... how was the gretl conference in
Berlin?
Today I'm concerned with an optimization problem.
I have the following goal:
Find a linear combination of input series in such a way that
- the resulting series is used to calculate the difference to another
exogenous series
- the variance of the 'difference-series' shall be minimized
- the coefficients are weights for the input series
- there are 2 groups of weights and in each group the weights sum up to 1
and are all positive
(In a later step I also restrict the coefficients so that the weights of the
two groups are identical.)
I managed to use gretl's numerical methods, somehow; see the code below.
However, the results are not as good as the Excel Solver solutions: gretl
gives a small, but not the minimal, variance.
I want to use gretl since stability analysis is much easier (and I prefer
gretl for such tasks at any rate :-).
Is it necessary to use analytical derivatives?
At the moment I'm not sure how to implement them in the optimization
procedure...
I also tried simann as a helper step, but that function ignores the
restrictions already built into the objective and did not produce useful
results.
I'd be happy to receive some advice.
Cheers
Leon
<hansl>
function scalar fn_Dif_min_Vola_sharesTo100_2gr (matrix *x, list lXlist, \
scalar nGr1, series sComp , series *sProg, series *sDiff)
scalar nParams = rows(x)
x[nGr1] = 1-sumc(x[1:nGr1-1])
if nParams - nGr1 != 1
x[nParams] = 1-sumc(x[nGr1+1:nParams-1])
endif
x = (x.<0)? 0 : x
x = (x.>1)? 1: x
series sProg = lincomb(lXlist, x)
series sDiff = sProg-sComp
scalar ret = -sd(sDiff)
return ret
end function
function scalar fn_Dif_min_Vola_equal_2gr (matrix *x, list lXlist,\
scalar nGr1, series sComp , series *sProg, series *sDiff)
scalar nParams = rows(x)
x[nGr1] = 1-sumc(x[1:nGr1-1])
# shares of 1st group to use for 2nd
x[nGr1+1:nParams] = x[1:nGr1]
x = (x.<0)? 0: x
x = (x.>1)? 1: x
series sProg = lincomb(lXlist, x)
series sDiff = sProg-sComp
scalar ret = -sd(sDiff)
return ret
end function
# not run
list xList1 = Input1 Input2 Input3 Input4
list xList2 = Input5 Input6 Input7 Input8
list lData = xList1 xList2
matrix mParams = {0.25; 0.25; 0.25; 0.25; 0.25; 0.25; 0.25; 0.25} # nx1 matrix
x = mParams
series y = lincomb(lData, mParams)
series sDiff = y - sExo
# simann produced incorrect values, since the 0<=x<=1 clamping inside the
# function is ignored
#u = simann(&x, fn_Dif_min_Vola_sharesTo100_2gr(&x, lData, 4, sExo, &y, &sDiff), 100)
#print x
u = BFGSmax(&x, fn_Dif_min_Vola_sharesTo100_2gr(&x, lData, 4, sExo, &y, &sDiff))
print x
</hansl>
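As a language-agnostic sketch of one way around the problem (an illustration of mine, not part of Leon's hansl): clamping the weights to [0,1] inside the objective makes the surface flat or discontinuous at the boundary, which can stall gradient-based optimizers like BFGS. An alternative is to enforce the constraints by construction, parameterizing each group's weights as squared free values normalized to sum to one. Below, a crude random hill climb stands in for BFGS, and the data are made up:

```python
import random
from statistics import pvariance

def weights(theta):
    """Map free parameters to positive weights that sum to 1."""
    sq = [t * t + 1e-12 for t in theta]   # squaring enforces positivity
    s = sum(sq)
    return [v / s for v in sq]            # normalizing enforces sum-to-1

def diff_var(theta, X1, X2, y):
    """Variance of (linear combination of inputs) minus the exogenous series."""
    w1 = weights(theta[:len(X1)])
    w2 = weights(theta[len(X1):])
    prog = [sum(w * x[t] for w, x in zip(w1, X1)) +
            sum(w * x[t] for w, x in zip(w2, X2)) for t in range(len(y))]
    return pvariance([p - v for p, v in zip(prog, y)])

def minimize(X1, X2, y, iters=3000, step=0.2, seed=1):
    """Crude random hill climb on the unconstrained parameters."""
    rng = random.Random(seed)
    theta = [1.0] * (len(X1) + len(X2))
    best = diff_var(theta, X1, X2, y)
    for _ in range(iters):
        cand = [t + rng.gauss(0.0, step) for t in theta]
        v = diff_var(cand, X1, X2, y)
        if v < best:
            theta, best = cand, v
    return weights(theta[:len(X1)]), weights(theta[len(X1):]), best

# Made-up data: y is a known interior mix of both groups' inputs
rng = random.Random(0)
X1 = [[rng.random() for _ in range(40)] for _ in range(2)]
X2 = [[rng.random() for _ in range(40)] for _ in range(2)]
y = [0.6 * a + 0.4 * b + 0.7 * c + 0.3 * d
     for a, b, c, d in zip(X1[0], X1[1], X2[0], X2[1])]
w1, w2, var = minimize(X1, X2, y)
print(w1, w2, var)
```

The same reparameterization can be done inside a hansl objective, so that BFGSmax sees a smooth unconstrained problem.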
9 years, 2 months
transition from cvs to git
by Allin Cottrell
For anyone used to building gretl from the CVS sources, please note
that we are making the transition to git.
The CVS repository is still there at present as an insurance policy,
but it won't be updated except in case of emergency, and will
probably be removed before long.
You can find a brief "survival guide" at
http://ricardo.ecn.wfu.edu/~cottrell/gretl-git-basics.html
with info on accessing the repository and on translating basic cvs
commands to their git equivalents.
Allin Cottrell
9 years, 2 months
data command with quiet option
by Annaert Jan
When I run the script:
clear
open fedstl.bin
data isratio --quiet
The data are loaded, but the script halts with an error after the data
command. When I delete the --quiet option, the error is avoided. The same
behavior is observed when opening other databases.
I'm running gretl 1.10.2 on Mac OS X 10.10.5. I did not experience this
issue in gretl 1.9.91.
Jan Annaert
9 years, 3 months
Holt-Winters package
by Raul Gimeno
Thank you Ignacio for your answer.
Would it be possible to modify your program so that the initial values
either correspond to a fixed value given manually, or to the OLS
estimates of the parameters for the whole time series (or for half of it,
as is usually the case)?
Would it be possible to include both the additive and multiplicative models?
Would you agree to integrate your program into the gretl menu under
"Variables/Filters"? I don't see the point of having a separate program for
it.
Would it be possible to expand your help text so that a "simple user" can
better understand how it works, especially what to do if you do not want a
seasonality parameter?
Is there a possibility to write the variables in your help with subscripts
instead of underscores to increase the readability of the equations?
Thank you for your help
Raul Gimeno
-----Original Message-----
From: gretl-users-bounces(a)lists.wfu.edu
[mailto:gretl-users-bounces@lists.wfu.edu] On behalf of
gretl-users-request(a)lists.wfu.edu
Sent: Tuesday, 29 September 2015 18:00
To: gretl-users(a)lists.wfu.edu
Subject: Gretl-users Digest, Vol 104, Issue 43
Send Gretl-users mailing list submissions to
gretl-users(a)lists.wfu.edu
To subscribe or unsubscribe via the World Wide Web, visit
http://lists.wfu.edu/mailman/listinfo/gretl-users
or, via email, send a message with subject or body 'help' to
gretl-users-request(a)lists.wfu.edu
You can reach the person managing the list at
gretl-users-owner(a)lists.wfu.edu
When replying, please edit your Subject line so it is more specific than
"Re: Contents of Gretl-users digest..."
Today's Topics:
1. Re: Data Import - non-numeric values (Schaff, Frederik)
2. Re: Data Import - non-numeric values (Riccardo (Jack) Lucchetti)
3. Re: Data Import - non-numeric values (Allin Cottrell)
4. Re: Holt-Winters package (Pedro Bação)
5. Re: Holt-Winters package (Ignacio Diaz-Emparanza)
----------------------------------------------------------------------
Message: 1
Date: Mon, 28 Sep 2015 17:32:46 +0000
From: "Schaff, Frederik" <Frederik.Schaff(a)fernuni-hagen.de>
To: Gretl list <gretl-users(a)lists.wfu.edu>
Subject: Re: [Gretl-users] Data Import - non-numeric values
Message-ID:
<94DD4923F1D1534189901CE4E97BB72E34966071(a)Ymir.buerokommunikation.fernuni-hagen.de>
Content-Type: text/plain; charset="iso-8859-1"
Hi there,
thanks very much Allin! I'll take the advice to heart. Fortunately, in the
cases where these garbage values are "created", the part of the analysis
corresponding to them has not been conducted, and that is flagged (in
another, "non-garbage" variable), so I can post-process these values.
What are the "maximal" values gretl takes as import? +-1e100 and +-1e-100?
Regards
Frederik
-----Original Message-----
From: gretl-users-bounces(a)lists.wfu.edu
[mailto:gretl-users-bounces@lists.wfu.edu] On behalf of Allin Cottrell
Sent: Sunday, 27 September 2015 21:02
To: Gretl list
Subject: Re: [Gretl-users] Data Import - non-numeric values
On Sun, 27 Sep 2015, Allin Cottrell wrote:
> Note that whether [uninitialized] values are taken as "numeric" or not
> will in general depend on the C library in use. But either way they're
> wrong and have to be changed. _If_ you can get such values into gretl
> as numeric, you could fix them via something like:
>
> foo = (abs(foo) > 0 && abs(foo) < 1.0e-100)? NA : foo
>
> where "foo" is the name of the series to be fixed and we're assuming
> that non-zero observations with absolute value less than 10^{-100} are
> garbage. This is not very reliable, however, as it's _possible_ that
> some uninitialized doubles happen to fall in the "normal" range and so
> escape correction.
After a little testing, let me rephrase that: it's more than "possible",
it's highly probable.
I wrote a little test C program which created an array of 2048 "doubles",
uninitialized. For each such value I printed it into a string variable using
sprintf() with the "%g" conversion then tried reading it back into a double
using strtod(). I counted the cases where strtod() raised the ERANGE error:
271 out of 2048. So in this case, at least, the great majority of garbage
values appeared to be
"fine": properly numeric and not subnormal.
So here's a big WARNING: on no account should one let uninitialized values
get printed into a file for use in econometric analysis.
There's no half-way reliable method for clearing them out.
[Just as a footnote: a "subnormal" number (also known as
"denormalized") is one that's too close to zero to be represented as a C
"double" to anything like the usual precision. And there's absolutely no
guarantee that the random bits in an uninitialized double will correspond to
a subnormal number.]
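Allin's experiment can be roughly reproduced with standard-library Python, under the (strong) assumption that uniformly random 64-bit patterns are a fair stand-in for uninitialized memory; real stack garbage is far from uniform, which is presumably why the ERANGE count in the C test was so much higher than the baseline here:

```python
import math
import random
import struct
import sys

def classify(bits):
    """Reinterpret a 64-bit pattern as an IEEE 754 double and classify it."""
    (x,) = struct.unpack("<d", struct.pack("<Q", bits))
    if math.isnan(x) or math.isinf(x):
        return "non-finite"
    if x != 0.0 and abs(x) < sys.float_info.min:  # below the smallest normal
        return "subnormal"
    return "normal"

rng = random.Random(42)
counts = {"normal": 0, "subnormal": 0, "non-finite": 0}
for _ in range(2048):
    counts[classify(rng.getrandbits(64))] += 1
print(counts)
```

With uniform bits, the exponent field is all-zero or all-ones only about once per 2048 draws each, so nearly every garbage pattern looks like a perfectly ordinary double, which is exactly why no post-hoc filter can be trusted.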
Allin Cottrell
_______________________________________________
Gretl-users mailing list
Gretl-users(a)lists.wfu.edu
http://lists.wfu.edu/mailman/listinfo/gretl-users
------------------------------
Message: 2
Date: Mon, 28 Sep 2015 22:08:18 +0200 (CEST)
From: "Riccardo (Jack) Lucchetti" <r.lucchetti(a)univpm.it>
To: Gretl list <gretl-users(a)lists.wfu.edu>
Subject: Re: [Gretl-users] Data Import - non-numeric values
Message-ID: <alpine.DEB.2.20.1509282206190.15621(a)ec-4.econ.univpm.it>
Content-Type: text/plain; charset="iso-8859-1"; Format="flowed"
On Mon, 28 Sep 2015, Schaff, Frederik wrote:
> Hi there,
>
> thanks very much Allen! I'll take the advice to heart. Fortunately in
> the case where these garbage values are "created" a part of the
> analysis (corresponding to these values) has not been conducted and
> that is flagged (in another "non-garbage" variable), so I can
> post-process these values. What are the "maximal" values gretl takes
> as import? +-1e100 and
> +-1e-100?
If I were you, I'd use a conventional value for "missing" (say,
-99999.99999), which would subsequently be easy to convert to a "proper"
missing entry via the gretl "setmiss" command.
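As a sketch of that sentinel approach in standard-library Python (the helper name is mine; in gretl itself the "setmiss" command performs this conversion):

```python
import math

SENTINEL = -99999.99999  # conventional "missing" code, per the suggestion above

def to_missing(values, sentinel=SENTINEL):
    """Replace the sentinel code with NaN to mark missing observations."""
    return [float("nan") if v == sentinel else v for v in values]

raw = [1.5, SENTINEL, 2.75]
clean = to_missing(raw)
print(clean)
```

The key property is that the sentinel is wildly outside the plausible data range, so an exact-match test is safe, unlike trying to guess which tiny values are garbage.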
-------------------------------------------------------
Riccardo (Jack) Lucchetti
Dipartimento di Scienze Economiche e Sociali (DiSES)
Università Politecnica delle Marche
(formerly known as Università di Ancona)
r.lucchetti(a)univpm.it
http://www2.econ.univpm.it/servizi/hpp/lucchetti
-------------------------------------------------------
------------------------------
Message: 3
Date: Mon, 28 Sep 2015 16:37:32 -0400 (EDT)
From: Allin Cottrell <cottrell(a)wfu.edu>
To: Gretl list <gretl-users(a)lists.wfu.edu>
Subject: Re: [Gretl-users] Data Import - non-numeric values
Message-ID: <alpine.LFD.2.20.1509281631060.2830(a)myrtle.attlocal.net>
Content-Type: text/plain; charset=US-ASCII; format=flowed
On Mon, 28 Sep 2015, Schaff, Frederik wrote:
> thanks very much Allen! I'll take the advice to heart. Fortunately in
> the case where these garbage values are "created" a part of the
> analysis (corresponding to these values) has not been conducted and
> that is flagged (in another "non-garbage" variable), so I can
> post-process these values. What are the "maximal" values gretl takes
> as import? +-1e100 and +-1e-100?
Gretl accepts the judgment of the C library on numerical underflow or
overflow. On the big side we can be fairly definite: anything less than
1.79769e308 should be fine. On the close-to-zero side numbers greater in
absolute value than 1e-308 should be OK for most C libraries.
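These limits are easy to check from any language whose floats are C doubles; for instance, in Python:

```python
import math
import sys

print(sys.float_info.max)   # largest finite double, about 1.7976931348623157e+308
print(sys.float_info.min)   # smallest *normal* positive double, about 2.225e-308
print(5e-324)               # smallest subnormal: representable, but with reduced precision
print(float("1e309"))       # past the top of the range: reads as infinity
```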
Allin
------------------------------
Message: 4
Date: Mon, 28 Sep 2015 21:40:33 +0100
From: Pedro Ba??o <pmab(a)fe.uc.pt>
To: Gretl list <gretl-users(a)lists.wfu.edu>
Subject: Re: [Gretl-users] Holt-Winters package
Message-ID:
<CAMwCGMcA=LiciPpePaCom1TsTVszRDQBcN+=V1NyrheSnquA=A(a)mail.gmail.com>
Content-Type: text/plain; charset=UTF-8
Some time ago I came across a similar problem when using this function for a
class. At the time I made a note to myself saying that the problem was in
the definition of lobs, which I changed to:
scalar lobs=lastobs(y)
I hope this helps
On 28 September 2015 at 15:56, Ignacio Diaz-Emparanza
<ignacio.diaz-emparanza(a)ehu.eus> wrote:
> I have not much time for testing today, but it seems that a correction
> I included to avoid initial conditions very different from the first
> observations of the series is not working well for your data. If you
> change in the HoltWinters package the line
>
> series yh1= (0.85*y<= $yhat && $yhat <=1.15*y) ? $yhat : y
>
> simply to:
>
> series yh1= $yhat
>
> you will have results more similar (though not identical) to yours (I think
> we are using different initial conditions).
>
>
>
> On 28/09/15 at 14:23, Raul Gimeno wrote:
>>
>> Hello
>>
>> I've been using the Holt-Winters package but I cannot replicate my
>> Excel-calculation results with this package.
>> The starting value from the package for the trend is 245; mine is 166.396.
>> By running a regression on the full sample I get completely different
>> results for these starting values, although the same methodology as
>> described in the help text has been used.
>> For replication purposes I send my excel spreadsheet and I would be
>> glad to understand how these starting values have been effectively
>> calculated.
>> Thank you for your help
>> Raul Gimeno
>
>
>
> --
> Ignacio Díaz-Emparanza
> Departamento de Economía Aplicada III (Econometría y Estadística)
> Universidad del País Vasco - Euskalherriko Unibertsitatea, UPV/EHU
> Tfno: (+34) 94 601 3732
> http://www.ehu.eus/ea3
>
------------------------------
Message: 5
Date: Tue, 29 Sep 2015 12:33:37 +0200
From: Ignacio Diaz-Emparanza <ignacio.diaz-emparanza(a)ehu.eus>
To: gretl-users(a)lists.wfu.edu
Subject: Re: [Gretl-users] Holt-Winters package
Message-ID: <560A6901.80301(a)ehu.eus>
Content-Type: text/plain; charset=utf-8; format=flowed
On 28/09/15 at 22:40, Pedro Bação wrote:
> Some time ago I came across a similar problem when using this function
> for a class. At the time I made a note to myself saying that the
> problem was in the definition of lobs, which I changed to:
> scalar lobs=lastobs(y)
> I hope this helps
>
Yes, I detected this problem some months ago and included some changes
for dealing with missing observations as well.
With respect to Raul's problem: apart from changing the line
series yh1= (0.85*y<= $yhat && $yhat <=1.15*y) ? $yhat : y
to
series yh1= $yhat
which I mentioned yesterday, I see that the differences in calculations
between Raul's Excel functions and this package were due to different
initial observations. As recently reported for the 'movavg' command
(exponential moving average), I was also ignoring the first observation
of the series. I have corrected this and committed the change (it is in
the staging area pending Allin's approval:
http://ricardo.ecn.wfu.edu/gretl/staging_fnfiles/). Now the results are
the same.
I also think we need more flexibility in the initial conditions; I will
work on this.
--
Ignacio Díaz-Emparanza
Departamento de Economía Aplicada III (Econometría y Estadística)
Universidad del País Vasco - Euskalherriko Unibertsitatea, UPV/EHU
Tfno: (+34) 94 601 3732
http://www.ehu.eus/ea3
------------------------------
End of Gretl-users Digest, Vol 104, Issue 43
********************************************
9 years, 3 months