On 15.10.2012 19:58, mariusz doszyć wrote:
Dear Colleagues,
Could you remind me of the current maximum size of a data set that
can be loaded into gretl? I have to work with a data set of 52000
objects observed over 250 weeks. It would be great to do this in gretl,
but I'm not sure whether it is possible (probably not); if not, maybe
you could take this into account in future versions?
Hi,
I'm not the authority on this, but AFAIR gretl can handle the data as
long as the machine is "big" enough. So if you have 52K series and 250
observations, and each data point is stored as an 8-byte
double-precision float, your memory requirement is roughly
52K * 250 * 8 bytes, i.e. about 100 MB, which is not excessive for
today's computers.
(Of course, once you start doing calculations, additional memory will be
needed for temporary matrices and the like, but still.)
good luck,
sven