Hello, gretl developers.
I'm trying to start a translation of the help files into Russian on
launchpad.net, as it seems to be the most suitable tool for everyone
who is familiar with econometrics but not with gettext, Linux, or CVS.
The problem is that there is already a project for gretl on Launchpad,
and it is strictly prohibited to start more than one project for a
single program. I have been unable to contact Constantine Tsardounis
for about a month, so I think it is time to re-assign that project to
someone else. On the Launchpad IRC channel I was told:
Our admins can re-assign the project to new owners, but we'd prefer to
hear from the upstream owners. Can you get one of them to submit a
But if none of the main developers wants to register and do something
on Launchpad, it is possible to assign this function to me, and in
that case a letter to this list will probably be enough.
I have prepared a .po file for genr_funcs.xml and gretl_commands.xml
with the help of the po4a utility and got 1511 strings for translation
(the strings are rather big).
Good luck, Ivan Sopov.
P.S. My previous letter about using launchpad for translation is
I would like to ask whether it's worthwhile to enable the OxGauss
functionality in combination with gretl's Ox support. (OxGauss means
that Ox can run many existing Gauss programs.) It seems to me that basic
support would be relatively simple, since only a -g switch is needed; so
for running a Gauss program 'mygauss.prg' gretl would need to call:
<path/to/>oxl -g mygauss.prg
(instead of '<path/to/>oxl myox.ox')
I guess a further issue would be how to pass matrices to what would
then be Gauss code, but note that even without that I think it would
already be useful to be able to do:
store @dotdir/mydata.dat --jmulti  # JMulTi's format should be the same as Gauss's (?)
and then, in the Gauss program:
T = 100; /* ugly hardcoding, but not the point here */
k = 2;
load datamatrix[T,k] = mydata.dat; /* hope OxGauss would find this */
I'm also not sure how much dedicated support for Gauss code (executed
via OxGauss) should be introduced, since Gauss in my view is a little
obsolete. But maybe opinions differ on that.
BTW, the background in my case is to build wrappers around the
structural break test code of Qu & Perron (Econometrica 2007) or
Bai & Perron etc., which is only available in Gauss and a little
lengthy; also I'm not sure whether their license would allow porting.
The list has been busy recently with bug reports (and also some feature
requests), and it's absolutely understandable that not all bugs could be
fixed right away. (It continues to amaze me that a sizeable
proportion of bug reports are addressed immediately by Allin.) So here's
a list of what may still be open issues. After clarification and
discussion I will transfer the remainder of this list into the bug
tracker and feature requests databases.
* icon for "code view" in the function package list window: change from
cogwheel to something more intuitive
* icon of the function package list window (and others) in the taskbar
is only generic (non-gretl) on Linux (self-compiled CVS) -- actually, I
don't get any icons for menu items on Linux (as opposed to Windows); I
suppose that's a bug in my setup?
* the help for invcdf() says P(X<x), but shouldn't it be P(X<=x)?
* the command 'include myfilenamewith.dots' fails even if
'myfilenamewith.dots.inp' exists (and is in the right place/dir),
apparently because gretl interprets .dots as a filename extension
* the "variable is being treated as discrete" behavior should be made
optional rather than automatic (I thought that was already the case
after a discussion some months/years ago)
* function namespace bug; see
* estimating one equation with 7 variables and 3 lags by OLS produces
a glitch with one of the variable names: instead of 'Yield_10yr_1' the
name 'd_Yield_10yr' is printed (maybe it has to do with the
underscores in the name?)
* The command 'rmplot' is defined only for the GUI. I (=Ignacio) think
it would be very good if we could use it also in scripts.
* script accessors ($test etc.) for bootstrapped test results
* the exogenous variables in a Johansen test setting aren't reported;
also, a warning should be printed that the critical values and p-values
are in general appropriate only in the case without exogenous variables
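Regarding the invcdf() item above, here is a quick script sketch to
check which convention is actually in force (this assumes gretl's
cdf() and invcdf() functions with the 'z' code for the standard
normal, and merely illustrates the check):

```
# sketch: check whether invcdf() follows the P(X <= x) convention
scalar x = invcdf(z, 0.975)   # about 1.96 for N(0,1)
scalar p = cdf(z, x)          # equals 0.975 if cdf() gives P(X <= x)
print x p
```

If p comes back as 0.975, the documentation should read P(X<=x).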
I am happy to announce that I am successfully managing the gretl
project on the openSUSE Build Service (OBS).
Now, users of openSUSE 11.2 (the current version) can easily install gretl.
The main search page is here: http://software.opensuse.org/search. You
will find roadruner:gretl (that's me :).
If any of you wish to help me, just say so :).
The OBS is a system for building software packages for Linux systems
and different processor architectures. I am having some difficulties in
setting up the package for Debian, Ubuntu, and openSUSE 11.2 x86_64. The
latter is, I suspect, due to the configure script ignoring --libexec
I have not had the chance to test the installation because I am on
x86_64. I am using the original installation script, and I did not
make a customized gretl.desktop file.
I hope you like this concept for gretl packaging.
I found some strings not marked for translation (Main menu -> Help ->
Check for updates):
"You are running gretl 1.8.6.
This is the current version."
Henrique C. de Andrade
PhD candidate in Applied Economics
Universidade Federal do Rio Grande do Sul
With respect to analyzing SVARs, I've taken a relatively close look at
the (AFAIK quite recent) 'vars' package in R, and I've also read Jack's
doc on his preliminary gretl SVAR functions (well, I also looked a
little at the functions themselves).
First of all, R's 'vars' package is quite complete and nicely
documented. So my first reaction is: don't bother reinventing the
wheel yet again; just point interested users to that package. I could also
write some wrapper functions (packages) to access them from the gretl side.
Two possible caveats:
1. Bootstrapping the impulse responses appears to be very slow in
R-vars. I compared (roughly) the same setup in Jmulti and in R-vars, and
Jmulti (Gauss in the background) was at least 10 times faster than
R-vars. (Admittedly I ran Jmulti on Windows on a different computer, but
the hardware specs are not too different and 9 seconds vs. 3 minutes is
drastic enough for me.)
2. R-vars relies on R-urca from the same author to specify the VECM (for
SVEC modeling), and AFAICS in R-urca it's not possible to place such
elaborate restrictions on the cointegration relations as gretl (remember
the Boswijk/Doornik paper) or Jmulti can do.
@1: In general I would tend to think it's better to help R-vars get
faster instead of developing a competing implementation. (Better to
collaborate than to multiply the needed efforts.) But I don't know if
that's feasible, I really don't know much about R or about the
implementation in R-vars. Also, since Jack has already done quite an
amount of work on this, gretl wouldn't start from scratch, so maybe
there wouldn't be that much duplicate effort (not counting the sunk
effort, of course).
@2: For me this seems a more serious issue, but that's of course driven
by my personal preferences. In my view modeling the cointegration space
properly for non-trivial setups often requires placing restrictions on
the cointegration space. So the fact that this feature is missing from
R-vars could be a reason to do a gretl-native implementation. (Not sure
whether native would necessarily mean C coding, though; pure gretl may be
enough?) OTOH, it could be argued that this is very specialized and
people should stick to Jmulti (or Anders Warne's SVAR program, which
actually does much more than that). Jack, AFAICS your functions also do
not allow restricted VECMs, right?
In gretl 1.8.0 we made this change:
"The matrices returned by the accessors $sigma and $vcv for VAR
systems now have a degrees of freedom correction."
However, the $sigma accessor for a VECM gives the ML estimator
(although the reported standard errors use a df adjustment).
I feel that our knickers are in a bit of a twist here, and I'm not
sure what the best solution is. In the context of VECM output we
print the cross-equation covariance matrix (accessed by $sigma)
and report its log-determinant. Now, the log-determinant as it
enters the likelihood calculation is obviously based on the ML
estimator, so it seemed to me problematic to print a different
matrix, and equally problematic to provide, via $sigma, a matrix
different from the one printed.
This becomes a live issue with the new $vcv accessor for VECMs.
For consistency with the reported standard errors it should
be df-adjusted, but for consistency with $sigma it should not be
adjusted. Urgh! (Right now in CVS $vcv is df-adjusted.)
I think that my (mild) preference would be to use ML values
throughout for VECMs, but as I recall we changed from that for
compatibility with, e.g., PcGive. Any further thoughts on this?
One further point: I've now enabled the $df accessor for VARs and
VECMs, so it's pretty easy for a user to multiply by $T/$df if
needed -- provided, of course, that we're clear on which variant
is provided by $sigma and $vcv.
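For illustration, here is a minimal gretl-script sketch of the $T/$df
rescaling mentioned above. It assumes a VAR or VECM has just been
estimated, and that the user knows which variant $sigma currently
provides; the matrix names are of course arbitrary:

```
# after estimating a VAR/VECM:
matrix S = $sigma
matrix S_df = S * $T/$df   # df-adjusted variant, if $sigma is ML
matrix S_ml = S * $df/$T   # ML variant, if $sigma is df-adjusted
```

This is why being clear on the convention matters: the two lines are
only correct under opposite assumptions about what $sigma holds.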
I have decided to follow Sven's advice: the default for normal Q-Q
plots is now to plot the empirical quantiles against the normal
distribution matched to the sample mean and variance.
If you prefer to have the data standardized, use the --z-scores
option; or if you prefer to plot the empirical quantiles against
those of N(0,1) use the --raw option.
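A short usage sketch of the behavior just described (the option names
are as stated above; the dataset data4-1 and its series 'price' are
just an example):

```
open data4-1
qqplot price               # default: against normal matched to
                           # the sample mean and variance
qqplot price --z-scores    # standardize the data first
qqplot price --raw         # empirical quantiles against N(0,1)
```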
We now have a reasonably full implementation of Q-Q plots. Please
see the help for "qqplot" in current CVS for details.
qqplot with one series argument: Q-Q against normal
qqplot with 2 series arguments: cross-plot of empirical quantiles
/Variable/Normal Q-Q plot
/View/Graph specified vars/Q-Q plot
Options for normal Q-Q plot:
default: plot standardized data against N(0,1)
option --sample-stats: plot raw empirical quantiles against those of
the normal distribution matched to the sample mean and variance
option --raw: empirical quantiles against N(0,1)
The last command related to $vcv produces an error in the following script:
vecm 3 1 LRM LRY; IDE(-1); IBO(-1) --crt
matrix sigmamat = $sigma
matrix omegamat = $vcv
According to the log of backward-incompatible changes (a very useful log
I must say... ;-):
"6/12/2008 Version 1.7.5 ... In the case of VARs/VECMs, $vcv formerly
referred to the cross-equation covariance of the residuals. Now $sigma
is used for that purpose; $vcv gets the variance of the coefficients,
which was not previously accessible."
I don't know whether it has been broken ever since then, though.