[taking this to devel instead of users]
On 13.02.2018 at 18:23, Allin Cottrell wrote:
> On Tue, 13 Feb 2018, Sven Schreiber wrote:
>> On 13.02.2018 at 03:43, Allin Cottrell wrote:
>>> But given how fast Julia is at generating random floating-point
>>> values, it seems to me there should be a real live example not far off.
>>
>> Yes. Again, I suggest to tackle the SB.gfn package as a benchmark.
> Would make an interesting test case.
OK, since I still don't have much experience with Julia, I have played
around with Python/Numba on the stationary bootstrap (SB) first. I'm
attaching a horserace between the default SB function from SB.gfn and a
JITted Python/NumPy/Numba implementation of the loop-based variant SB_old2.
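For reference, the JITted loop could look roughly like this minimal sketch. The function name `sb_indices` and the details are my own illustration (geometric block lengths with mean 1/p, à la Politis/Romano), not necessarily SB_old2's exact internals:

```python
import numpy as np

try:
    from numba import njit  # JIT-compile the loop if Numba is available
except ImportError:
    njit = lambda f: f      # fall back to plain Python otherwise

@njit
def sb_indices(T, p, B):
    # Draw B stationary-bootstrap index columns of length T.
    # Block lengths are geometric with mean 1/p.
    out = np.empty((T, B), dtype=np.int64)
    for b in range(B):
        idx = np.random.randint(0, T)          # random block start
        for t in range(T):
            out[t, b] = idx
            if np.random.random() < p:
                idx = np.random.randint(0, T)  # start a new block
            else:
                idx = (idx + 1) % T            # continue block, wrap around
    return out

ix = sb_indices(250, 0.1, 1000)  # index matrix for 1000 replications
```

With `@njit` the double loop compiles to machine code on first call, which is where the fixed overhead comes from.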
It turns out Numba is about 5x faster for the pure calculations (the
exact factor depends on the problem size, since the JIT overhead is
fixed), but the bottleneck is the disk-based transfer of the big random
matrix back to the gretl environment. Here are my results for 39999
replications:
SB 1.3, 2017-11-14 (F. Di Iorio, S. Fachin, A. Tarassow, Riccardo "Jack"
Lucchetti)
Gretl native (SB package)
This took = 6.067643 sec.
Python / Numba pure calculation
This took = 1.323467 sec.
Python / Numba with transfer
This took = 7.564688 sec.
I believe something similar would emerge for Julia. So unless the data
transfer can be organized in some other way, I think well-written hansl
code can only be beaten when the data-reduction step can already happen
on the Python (or Julia) side.
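To illustrate the data-reduction idea: if the bootstrap statistic is computed on the Python side, only a length-B vector has to travel back to gretl instead of the full T-by-B matrix. A hedged sketch (hypothetical function name; bootstrap means as a stand-in statistic):

```python
import numpy as np

try:
    from numba import njit  # JIT-compile if Numba is available
except ImportError:
    njit = lambda f: f      # fall back to plain Python otherwise

@njit
def sb_means(x, p, B):
    # Resample x with the stationary bootstrap, but return only the
    # B bootstrap means -- a B-vector instead of a T x B matrix.
    T = x.shape[0]
    out = np.empty(B)
    for b in range(B):
        s = 0.0
        idx = np.random.randint(0, T)          # random block start
        for t in range(T):
            s += x[idx]
            if np.random.random() < p:
                idx = np.random.randint(0, T)  # start a new block
            else:
                idx = (idx + 1) % T            # continue block, wrap around
        out[b] = s / T
    return out
```

Only `out` (B doubles) would then need to cross the gretl/Python boundary, so the transfer cost becomes negligible regardless of T.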
cheers,
sven