On Mon, 11 Jan 2021, Sven Schreiber wrote:
> On 11 Jan 2021 at 19:38, Allin Cottrell wrote:
>> Please try the test under "data-io" at
> Sorry, observing this on Windows here, I just did a much cruder
> comparison: loading the same dataset from gdtb (new, I think)
If by "new" you mean the new pure-binary format, a gretl data file
will be in that format only if saved via CLI with the --purebin option.
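(As a concrete illustration, not taken from the thread: assuming the
--purebin flag mentioned above is accepted by gretl's "store" command,
a CLI/script save in the pure-binary format might look like the sketch
below; the file names are placeholders.)

```hansl
# hypothetical sketch: load a dataset and re-save it in the
# pure-binary gdtb variant via the --purebin option
open bigdata.gdt
store bigdata.gdtb --purebin
```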
> and from gdt. BTW, gdt in this heavily unbalanced case is smaller,
> at only 18MB compared to 26MB before. I get:
> - 6.6s for gdtb
> - 14.7s for gdt
> So yes, the binary stuff still helps a lot. It was just my expectation
> that 26MB somehow "should" load faster than that, but I guess that
> was simply too optimistic.
What are the specs of the machine in question? On my 2014-vintage
desktop the 26MB load time is less than a second for both formats
(and 20 milliseconds for the new binary format).
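For what it's worth, this sort of load-time comparison can be scripted
rather than eyeballed; a minimal hansl sketch using gretl's stopwatch
facility (file names are placeholders):

```hansl
# time loading the same data in both formats;
# "set stopwatch" resets the timer, $stopwatch reads elapsed seconds
set stopwatch
open bigdata.gdtb --quiet
printf "gdtb load: %.3f s\n", $stopwatch
set stopwatch
open bigdata.gdt --quiet
printf "gdt load:  %.3f s\n", $stopwatch
```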
> Hm, then I wanted to check what the file size would be without
> compression. So I exported to gdtb, setting compression to 0 (all in
> the GUI); after a while I got an error window: " .... error zipping" !?
I can't reproduce that, either on Linux or Windows.