gretl forgets some function packages on opening a dataset
by Sven Schreiber
Hello,
this is with the June-16 snapshot. I had told gretl to look for function
packages in some additional directory, and these packages appeared in
the package list window. Then I opened a dataset, and the additional
packages were gone.
In other words, starting a new gretl "session" seems to erase the
package location memory. Is that a bug or a feature?
thanks,
sven
10 years, 6 months
Error message refinement
by Sven Schreiber
Hi,
just a trivial (?) suggestion: When I erroneously try to stuff a list
into a bundle, I get the somewhat bizarre error message: "Variable b is
of type bundle" -- something I had long known! Perhaps gretl could
instead say "Bundle b cannot hold lists"?
thanks,
sven
matrix_perf
by Hélio Guilherme
Here are my results (Fedora Core 19 x64):
first the shell output:
load_function_package_from_file:
'/home/helio/gretl/functions/matrix_perf.gfn' is already loaded
detect blas: confused, found too many libs!
-----
[helio@localhost gretl]$ ldd /usr/local/bin/gretl_x11
linux-vdso.so.1 => (0x00007fffb9bfe000)
libgtksourceview-3.0.so.1 => /lib64/libgtksourceview-3.0.so.1
(0x00000033d9200000)
libgtk-3.so.0 => /lib64/libgtk-3.so.0 (0x00000033d7800000)
libgdk-3.so.0 => /lib64/libgdk-3.so.0 (0x00000033d7400000)
libatk-1.0.so.0 => /lib64/libatk-1.0.so.0 (0x00000035a7e00000)
libgio-2.0.so.0 => /lib64/libgio-2.0.so.0 (0x00000035a6600000)
libpangocairo-1.0.so.0 => /lib64/libpangocairo-1.0.so.0
(0x00000033d8600000)
libcairo-gobject.so.2 => /lib64/libcairo-gobject.so.2
(0x00000033d7000000)
libpango-1.0.so.0 => /lib64/libpango-1.0.so.0 (0x0000003098c00000)
libcairo.so.2 => /lib64/libcairo.so.2 (0x00000033d8a00000)
libgretl-1.0.so.10 => /usr/local/lib/libgretl-1.0.so.10
(0x00007f87cce3e000)
liblapack.so.3 => /usr/lib64/atlas/liblapack.so.3
(0x00007f87cc5da000)
libblas.so.3 => /lib64/libblas.so.3 (0x000000316e600000)
libgfortran.so.3 => /lib64/libgfortran.so.3 (0x000000327f600000)
libm.so.6 => /lib64/libm.so.6 (0x000000327d600000)
libdl.so.2 => /lib64/libdl.so.2 (0x000000327ca00000)
libz.so.1 => /lib64/libz.so.1 (0x000000327d200000)
libxml2.so.2 => /lib64/libxml2.so.2 (0x0000003283200000)
libgmp.so.10 => /lib64/libgmp.so.10 (0x0000003e4b000000)
libfftw3.so.3 => /lib64/libfftw3.so.3 (0x00000036ca600000)
libcurl.so.4 => /lib64/libcurl.so.4 (0x00000036be600000)
libgdk_pixbuf-2.0.so.0 => /lib64/libgdk_pixbuf-2.0.so.0
(0x00000033d8200000)
libgobject-2.0.so.0 => /lib64/libgobject-2.0.so.0
(0x00000035a5e00000)
libglib-2.0.so.0 => /lib64/libglib-2.0.so.0 (0x00000035a5a00000)
libgomp.so.1 => /lib64/libgomp.so.1 (0x0000003285e00000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x000000327ce00000)
libc.so.6 => /lib64/libc.so.6 (0x000000327c600000)
libgmodule-2.0.so.0 => /lib64/libgmodule-2.0.so.0
(0x00000035a6200000)
libX11.so.6 => /lib64/libX11.so.6 (0x0000003280a00000)
libXi.so.6 => /lib64/libXi.so.6 (0x0000003284600000)
libXfixes.so.3 => /lib64/libXfixes.so.3 (0x0000003284e00000)
libatk-bridge-2.0.so.0 => /lib64/libatk-bridge-2.0.so.0
(0x00000035a9e00000)
libpangoft2-1.0.so.0 => /lib64/libpangoft2-1.0.so.0
(0x0000003099000000)
libfontconfig.so.1 => /lib64/libfontconfig.so.1 (0x0000003284200000)
libXinerama.so.1 => /lib64/libXinerama.so.1 (0x0000003282a00000)
libXrandr.so.2 => /lib64/libXrandr.so.2 (0x0000003282e00000)
libXcursor.so.1 => /lib64/libXcursor.so.1 (0x0000003286200000)
libXcomposite.so.1 => /lib64/libXcomposite.so.1 (0x000000328b200000)
libXdamage.so.1 => /lib64/libXdamage.so.1 (0x0000003289a00000)
libwayland-client.so.0 => /lib64/libwayland-client.so.0
(0x000000328a600000)
libxkbcommon.so.0 => /lib64/libxkbcommon.so.0 (0x000000328ce00000)
libwayland-cursor.so.0 => /lib64/libwayland-cursor.so.0
(0x000000328e200000)
libXext.so.6 => /lib64/libXext.so.6 (0x0000003280e00000)
libffi.so.6 => /lib64/libffi.so.6 (0x000000327f200000)
libselinux.so.1 => /lib64/libselinux.so.1 (0x000000337d200000)
libresolv.so.2 => /lib64/libresolv.so.2 (0x000000327ee00000)
libgthread-2.0.so.0 => /lib64/libgthread-2.0.so.0
(0x00000035a6a00000)
libharfbuzz.so.0 => /lib64/libharfbuzz.so.0 (0x0000003179a00000)
libfreetype.so.6 => /lib64/libfreetype.so.6 (0x0000003281e00000)
libpixman-1.so.0 => /lib64/libpixman-1.so.0 (0x0000003287200000)
libEGL.so.1 => /lib64/libEGL.so.1 (0x000000337e600000)
libpng15.so.15 => /lib64/libpng15.so.15 (0x00000033d6c00000)
libxcb-shm.so.0 => /lib64/libxcb-shm.so.0 (0x0000003288600000)
libxcb-render.so.0 => /lib64/libxcb-render.so.0 (0x0000003288200000)
libxcb.so.1 => /lib64/libxcb.so.1 (0x0000003280600000)
libXrender.so.1 => /lib64/libXrender.so.1 (0x0000003282200000)
libGL.so.1 => /lib64/libGL.so.1 (0x000000337de00000)
librt.so.1 => /lib64/librt.so.1 (0x000000327da00000)
/lib64/ld-linux-x86-64.so.2 (0x000000327c200000)
libf77blas.so.3 => /usr/lib64/atlas/libf77blas.so.3
(0x00007f87cc3ac000)
libcblas.so.3 => /usr/lib64/atlas/libcblas.so.3 (0x000000364d000000)
libquadmath.so.0 => /lib64/libquadmath.so.0 (0x00007f87cc16f000)
libgcc_s.so.1 => /lib64/libgcc_s.so.1 (0x000000327ea00000)
liblzma.so.5 => /lib64/liblzma.so.5 (0x0000003281600000)
libidn.so.11 => /lib64/libidn.so.11 (0x000000329d000000)
libssh2.so.1 => /lib64/libssh2.so.1 (0x00000036be200000)
libssl3.so => /lib64/libssl3.so (0x0000003c22c00000)
libsmime3.so => /lib64/libsmime3.so (0x0000003c22800000)
libnss3.so => /lib64/libnss3.so (0x0000003c22000000)
libnssutil3.so => /lib64/libnssutil3.so (0x0000003c21c00000)
libplds4.so => /lib64/libplds4.so (0x0000003c21400000)
libplc4.so => /lib64/libplc4.so (0x0000003c21000000)
libnspr4.so => /lib64/libnspr4.so (0x0000003c21800000)
libgssapi_krb5.so.2 => /lib64/libgssapi_krb5.so.2
(0x00000036bd200000)
libkrb5.so.3 => /lib64/libkrb5.so.3 (0x00000036bca00000)
libk5crypto.so.3 => /lib64/libk5crypto.so.3 (0x00000036bda00000)
libcom_err.so.2 => /lib64/libcom_err.so.2 (0x0000003290200000)
liblber-2.4.so.2 => /lib64/liblber-2.4.so.2 (0x000000398c800000)
libldap-2.4.so.2 => /lib64/libldap-2.4.so.2 (0x0000003c24000000)
libatspi.so.0 => /lib64/libatspi.so.0 (0x00000035a9600000)
libdbus-1.so.3 => /lib64/libdbus-1.so.3 (0x0000003281200000)
libexpat.so.1 => /lib64/libexpat.so.1 (0x0000003283600000)
libpcre.so.1 => /lib64/libpcre.so.1 (0x000000337ce00000)
libgraphite2.so.3 => /lib64/libgraphite2.so.3 (0x000000328aa00000)
libX11-xcb.so.1 => /lib64/libX11-xcb.so.1 (0x0000003285600000)
libxcb-dri2.so.0 => /lib64/libxcb-dri2.so.0 (0x0000003289600000)
libxcb-xfixes.so.0 => /lib64/libxcb-xfixes.so.0 (0x0000003286600000)
libxcb-shape.so.0 => /lib64/libxcb-shape.so.0 (0x000000328be00000)
libgbm.so.1 => /lib64/libgbm.so.1 (0x000000337e200000)
libwayland-server.so.0 => /lib64/libwayland-server.so.0
(0x0000003287e00000)
libglapi.so.0 => /lib64/libglapi.so.0 (0x000000337ee00000)
libudev.so.1 => /lib64/libudev.so.1 (0x0000003281a00000)
libdrm.so.2 => /lib64/libdrm.so.2 (0x0000003286e00000)
libXau.so.6 => /lib64/libXau.so.6 (0x0000003280200000)
libxcb-glx.so.0 => /lib64/libxcb-glx.so.0 (0x0000003287600000)
libXxf86vm.so.1 => /lib64/libXxf86vm.so.1 (0x0000003283e00000)
libatlas.so.3 => /usr/lib64/atlas/libatlas.so.3 (0x000000364c200000)
libssl.so.10 => /lib64/libssl.so.10 (0x00000036bde00000)
libcrypto.so.10 => /lib64/libcrypto.so.10 (0x00000039a7a00000)
libkrb5support.so.0 => /lib64/libkrb5support.so.0
(0x00000036bd600000)
libkeyutils.so.1 => /lib64/libkeyutils.so.1 (0x00000036bce00000)
libsasl2.so.3 => /lib64/libsasl2.so.3 (0x0000003c23c00000)
libcrypt.so.1 => /lib64/libcrypt.so.1 (0x0000003c22400000)
libfreebl3.so => /lib64/libfreebl3.so (0x0000003c20c00000)
----
gretl version 1.9.91cvs
Current session: 2014-06-16 22:25
? include matrix_perf.gfn
/home/helio/gretl/functions/matrix_perf.gfn
? matrix_perf(1234)
dgemm experiment 1, variant 1, speed in Gflops
m n k vanilla openmp sysblas
128 128 128 0,76280 3,8224 1,2408
128 128 256 1,4744 4,4971 2,3319
128 128 512 1,5409 4,6394 2,2827
128 128 1024 1,6307 4,7384 2,4370
128 128 2048 1,6366 4,6429 2,3655
result: openmp dominates
dgemm experiment 1, variant 2, speed in Gflops
m n k vanilla openmp sysblas
128 128 128 1,2369 4,2107 2,3447
256 256 128 1,7027 4,2914 2,4852
512 512 128 1,7378 4,3442 2,4599
1024 1024 128 1,7530 4,5483 2,3357
2048 2048 128 1,6727 4,3896 2,4288
result: openmp dominates
dgemm experiment 1, variant 3, speed in Gflops
m n k vanilla openmp sysblas
128 128 128 1,3331 1,6034 1,0660
256 256 256 1,7022 4,6465 2,2175
512 512 512 1,7230 3,6227 2,6052
1024 1024 1024 1,5251 4,3004 2,0975
2048 2048 2048 1,5605 4,1371 2,0563
result: openmp dominates
dgemm experiment 2, variant 1, speed in Gflops
m n k vanilla openmp sysblas
8 8 8 0,54613 0,19646 0,48416
16 8 8 0,81994 0,38813 0,68170
32 8 8 1,0980 0,62199 0,85589
64 8 8 1,3154 0,82638 1,0962
128 8 8 1,4332 1,0507 1,2646
256 8 8 1,4836 1,2024 1,4424
512 8 8 1,5283 1,2608 1,5017
1024 8 8 1,5241 1,3197 1,4720
2048 8 8 1,5281 1,3507 1,5226
4096 8 8 1,5382 1,3684 1,5155
result: vanilla dominates
dgemm experiment 2, variant 2, speed in Gflops
m n k vanilla openmp sysblas
10 2 1000 1,2576 1,1964 1,6339
20 2 1000 1,3761 1,9025 1,8154
40 2 1000 1,2545 1,9819 1,8016
80 2 1000 1,6691 2,2706 2,1917
160 2 1000 1,5171 2,3899 2,4151
320 2 1000 1,6823 2,5837 2,3722
640 2 1000 1,5027 2,8140 2,1026
1280 2 1000 1,4856 2,8152 2,1165
2560 2 1000 1,5764 2,3430 2,0931
5120 2 1000 1,5695 3,4026 2,1107
result: openmp dominates for mnk >= 640000
dgemm experiment 2, variant 3, speed in Gflops
m n k vanilla openmp sysblas
10 10 1000 1,1356 2,6000 1,7212
20 10 1000 1,3239 3,8221 1,5229
40 10 1000 1,2920 3,7631 1,8438
80 10 1000 1,4216 4,1526 2,2323
160 10 1000 1,6549 4,3422 2,4932
320 10 1000 1,6679 4,3360 2,4248
result: openmp dominates
Operating system: Linux (64-bit)
BLAS library: sysblas
Number of processors: 4
OpenMP enabled: yes
Performance summary:
vanilla -
dominates outright in 1 out of 6 tests
openmp -
dominates outright in 4 out of 6 tests
dominates in 1 test(s) for mnk >= 640000
sysblas -
dominates outright in 0 out of 6 tests
CVS news, part 1
by Allin Cottrell
There's quite a lot going on in gretl CVS right now. Some of the new
things will require more time to document; here I'll just mention a
few quick things.
* The 64-bit Windows build now includes the heavily optimized
OpenBLAS library instead of the Netlib "reference" BLAS and Lapack.
There should be a substantial speed-up on some big matrix
operations.
* There's a new function package on the server, matrix_perf, which
runs some tests of matrix multiplication performance. It would be
interesting to hear what results people get from this -- to see if
there are any surprises. The package requires current CVS or
snapshot.
* The matrix_perf package illustrates a new command, "flush" (it's
documented). Run the package and you'll see what I mean. I recommend
taking a look at "flush" if you're the author of a function package
(some of) whose functions take a relatively long time to complete;
see if it might work for you.
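Just to give the flavor, here's a purely illustrative sketch of the
sort of place where "flush" helps (the stage names and messages are
made up, not part of any actual package):
<hansl>
# hypothetical sketch of a long-running routine
print "starting stage 1 (this may take a while)..."
flush   # show the accumulated output now, rather than at the end
# ... lengthy stage-1 computation would go here ...
print "stage 1 done, starting stage 2..."
flush
# ... lengthy stage-2 computation would go here ...
print "all done"
</hansl>
Without the flush calls, a GUI user would see nothing until the whole
routine had completed.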
* Our CSV importer can now handle UTF-16 (and in principle UTF-32)
files, as produced by MS apps: we use GLib to recode such material
to UTF-8.
* Windows that display bundles, or that display output from a
function which returns a bundle in the background, now have what I
hope people will agree is a nicer interface.
Allin
our handling of daily data
by Allin Cottrell
Sven has raised the question of the handling of daily data in gretl;
see the threads starting from
http://lists.wfu.edu/pipermail/gretl-users/2014-May/010037.html
I'm glad of that: it's time we clarified what we do now, and what we
should do in future. (But please note, I'm mostly talking here about
5-day financial-market data; other sorts of daily data might require
different handling.)
Sorry, this is long, but I'd encourage those who work with daily
data to read on...
First a minor point in relation to Sven's example: I think the
Bundesbank is in fact unusual in including blank weekends in
business-day data files. At least, that's not the practice of the
Federal Reserve, the Bank of England, the Banque de France, the
Banca d'Italia, the Sveriges Riksbank... (at which point I got tired
of googling).
Anyway, it's (now) easy enough to strip out weekends, which leaves
the more interesting question of how to deal with holidays.
I think it's fair to say:
(a) most econometricians who wish to apply time-series methods to
daily financial market data will, most of the time, want to ignore
holidays as well as weekends, treating the data as if these days did
not exist and the actual trading days formed a continuous series,
but
(b) for some purposes it may be important to be able to recover
information on (e.g.) which days were Mondays or which days followed
holidays.
How are these needs best supported by econometric software? I can
see two possibilities:
(1) The storage for 5-day data includes rows for all Mondays to
Fridays (or even all days as per the Bundesbank) -- hence satisfying
point (b) automatically -- and the software provides a mechanism for
skipping non-trading days on demand when estimating models.
(2) The data storage includes only actual trading days -- hence
satisfying point (a) automatically -- but with a record of their
calendar dates, and the software provides means of retrieving the
information under point (b) on demand.
Currently gretl includes a half-hearted gesture towards approach (1)
but de facto mostly relies on approach (2). Let me explain.
When we first introduced support for daily data I initially assumed
that we'd want to store 5-day data including rows for all relevant
days, with NAs for holidays. So in view of point (a) above I put in
place a mechanism for skipping NAs in daily data when doing OLS. But
this never got properly documented, and it was never extended to
other estimators.
What happened? Well, as we started adding examples of daily data to
the gretl package it became apparent that approach (2) is quite
common in practice. See for example the "djclose" dataset from Stock
and Watson and the Bollerslev-Ghysels exchange-rate returns series
(b-g.gdt). Both of these have non-trading days squeezed out of them;
let's call this "compressed" daily data.
The Bollerslev-Ghysels dataset is not the best example, as the
authors did not record the actual dates of the included
observations, only the starting and ending dates. But djclose will
serve as a test case: although it excludes non-trading days the date
of each observation is recorded in its "marker" string and it's
straightforward to retrieve all the information one might want via
gretl's calendrical functions, as illustrated below.
<hansl>
/* analysis of compressed 5-day data */
open djclose.gdt
# get day of week and "epoch day" number
series wd = weekday($obsmajor, $obsminor, $obsmicro)
series ed = epochday($obsmajor, $obsminor, $obsmicro)
# maybe we want a dummy for Mondays?
series monday = wd == 1
# find the "delta days" between observations
series delta = diff(ed)
# the "standard" delta days in absence of holidays:
# three for Mondays, otherwise one
series std_delta = wd == 1 ? 3 : 1
# create a dummy for days following holidays
series posthol = delta > std_delta
# take a look...
print wd monday delta posthol --byobs
</hansl>
Here's a proposal for regularizing our handling of daily data. In
brief, it's this: scrap our gesture towards what I called approach
(1) above, and beef up our support for approach (2).
Why get rid of the mechanism for automatically skipping NAs in daily
data for OLS? Because it's anomalous that it only works for OLS, it
would be a lot of work to provide this mechanism for all estimators,
and anyway it probably should not be automatic: ignoring NAs when
they're present in the dataset should require some user
intervention.
By beefing up approach (2) I mean providing easy means of converting
between "uncompressed" and "compressed" daily data. We already
support both variants, but (a) given an uncompressed daily sequence
it should be easy for the user to squeeze out NAs if she thinks
that's appropriate for estimation purposes, and (b) it might be
useful in some contexts to be able to reconstitute the full calendar
sequence from a compressed dataset such as djclose.
Such conversion is possible via "low-level" hansl, but not
convenient. I've therefore added the following two things in
CVS/snapshots:
(1) If you apply an "smpl" restriction to a daily dataset, we try to
reconstitute a useable daily time-series. If it has gaps, we record
the specific dates of the included observations. At present this is
subject to two conditions, which are open to discussion.
(i) Define the "delta" of a given daily observation as the epoch day
(= 1 for the first of January in 1 AD) of that observation minus the
epoch day of the previous one. So, for example, in the case of
complete 7-day data the delta will always be 1. With complete 5-day
data the delta will be 3 for Mondays and 1 for Tuesdays through
Fridays. The first condition on converting from "full" data to
something like djclose.gdt (dated daily data with gaps) is that the
maximum daily delta is less than 10.
(ii) The "smpl" restriction in question may involve throwing away
"empty" weekends; this will lose about 2/7 of the observations and
preserve about 5/7. Allowing for this, we then require that the
number of observations in the sub-sample is at least 90 percent of
the maximum possible. Or in other words we're allowing up to 10
percent loss of observations due to holidays. That's generous --
perhaps too generous?
(The point of these restrictions is to avoid "pretending" that a
seriously gappy daily sequence -- much gappier than could be
accounted for by trading holidays -- can be treated as if it were a
continuous time series for econometric purposes.)
(2) Second thing added: a new trope for the "dataset" command,
namely
dataset pad-daily <days-in-week>
This will pad out a dataset such as djclose, adding in NAs for
holidays and (if the days-in-week parameter is 7) for weekends too.
I'm not sure if this second thing is worth keeping and documenting,
but for now it permits a test of the whole apparatus by
round-tripping. Here's an example, supposing we're starting from
data on a complete 7-day calendar, but with empty weekends and
all-NA rows for holidays (as in Sven's Bundesbank data):
<hansl>
open <seven-day-data>
outfile orig.txt --write
print --byobs
outfile --close
smpl --no-missing --permanent
outfile compressed.txt --write
print --byobs
outfile --close
dataset pad-daily 7
outfile reconstructed.txt --write
print --byobs
outfile --close
string diffstr = $(diff orig.txt reconstructed.txt)
printf "diffstr = '%s'\n", diffstr
</hansl>
So if the round trip is successful, diffstr should be empty. Ah, but
with Sven's data it's not quite empty. What's the problem? It's with
the logic of --no-missing, which excludes all rows on which there's
at least one NA. What we really want, to skip holidays, is to
exclude all and only those rows on which all of our daily variables
are NA. That's feasible via raw hansl, but not so convenient. So one
more modification to "smpl" in CVS: add an option --no-all-missing
(the name may be debatable). Substitute --no-all-missing for
--no-missing in the script above and the difference between orig.txt
and reconstructed.txt really is null.
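For reference, the effect of --no-all-missing can be approximated in
raw hansl if you spell out the variables by name (x1, x2 and x3 here
are hypothetical daily series):
<hansl>
# drop only those rows on which every daily variable is NA
series allmiss = missing(x1) && missing(x2) && missing(x3)
smpl allmiss == 0 --restrict
</hansl>
The new option just saves you from having to enumerate the series,
which gets tiresome with a dataset of any size.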
If you don't have a handy Bundesbank-style data file (though it's
not hard to fake one), here's another round-trip test, in the other
direction: we pad out djclose then shrink it again.
<hansl>
open djclose.gdt -q
outfile orig.txt --write
print --byobs
outfile --close
dataset pad-daily 5
outfile padded.txt --write
print --byobs
outfile --close
smpl --no-all-missing --permanent
outfile reconstructed.txt --write
print --byobs
outfile --close
string diffstr = $(diff orig.txt reconstructed.txt)
printf "diffstr = '%s'\n", diffstr
</hansl>
The use of the --permanent option in the round-trip scripts is just
to ensure that all vestiges of the original data are destroyed
before the reconstruction takes place. In "normal usage" one could
just do
<hansl fragment="true">
open <seven-day-data>
smpl --no-all-missing
</hansl>
then carry out econometric analysis without tripping over NAs.
Allin
Function package download stats
by Sven Schreiber
Hi,
I'm wondering, given that there's only one function package server (as
opposed to the various ways you can get gretl itself), are there
download stats for the packages?
Thanks,
sven
data importer improvements
by Allin Cottrell
Current gretl CVS includes several improvements in respect of data
importation. Here are the main points.
1) In relation to a file such as
http://www.ggdc.net/maddison/maddison-project/data/mpd_2013-01.xlsx
there were a few issues raised by Sven in
http://lists.wfu.edu/pipermail/gretl-devel/2014-May/005091.html
This is an historical data file, not a time series in the usual
sense but one with a time dimension. In addition, several column
headings are far from being valid gretl variable names (e.g. they
start with numbers or punctuation) and two of them are missing
altogether. It was a fair amount of work to get this to open at all
in gretl.
Now you can open such a file directly, with a row offset of 2 to
skip the header:
open mpd_2013-01.xlsx --rowoffset=2
The column headings are automatically purged of junk and the missing
ones are filled in with v<number>. Gretl does not treat the dataset
as time-series, but it does import the years in the first column as
observation markers. If you want to treat the data as annual time
series (with many more gaps than data-years) you can now achieve
this with
nulldata 2010
setobs 1 1 --time-series
append mpd_2013-01.xlsx --rowoffset=2
Here we force the issue by creating an annual time series running
from the year 1 to 2010, then importing the Maddison data, whose
observation markers are compatible with the annual dataset
structure.
2) I recently visited FRED and downloaded an xls file containing
daily data on Treasury Bill rates. I noticed that there were a
couple of issues with such files.
i) The daily dates in the first column were not being recognized by
gretl as such, because they don't use a built-in Excel date format.
However, we now guess that if a custom numerical format is used in
column 1 this probably implies dates.
ii) Missing values came into gretl as zeros. This is because FRED
records NAs using the Excel formula NA(). Logical enough, but when
gretl encounters an Excel formula it reads the result that's stored
along with the formula, and in XLS the result stored by NA() is 0.
Nice, not! So now when we get a 0 result from a formula we check to
see if the formula is in fact NA().
There's also a relatively minor third issue: as the xls importer
stood it could produce garbage in place of the name of an xls
worksheet if the name involved "rich text" and/or "extended
characters". Handling of sheet names in seriously non-ASCII cases is
now better but by no means perfect.
Allin