On Sun, 27 Sep 2015, Schaff, Frederik wrote:
 Hello everybody,
 I have a data-set with some non-numeric values according to gretl 
 (but not to CSVed). [1]:  Problematic values (examples): 
 4.13108e-312 5.27295e-270 4.3172e-227 1.51922e-184 3.97655e-142 
 1.42149e-144 1.94409e-143 2.79222e-144 (These values stem from 
 uninitialised double variables in my C++ model, I guess). 
Interesting. Some of these numbers are in the "underflow" range where 
their handling by the C-library function strtod() -- which gretl uses 
as its test for numeric vs non-numeric -- is "implementation-defined", 
meaning that standard-conformant C libraries may legitimately behave 
differently.
With glibc 2.22 (Arch Linux) the first string above, "4.13108e-312", 
is converted to a non-zero value OK, but errno is set to ERANGE 
("Numerical result out of range") because the result is "subnormal". 
Gretl notices the non-zero errno and takes the value to be 
non-numeric. (All the other values above are converted without errno 
being set.)
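For illustration, here's a small stand-alone C program -- not gretl's
actual import code, just a sketch of the same strtod/errno check --
that you can use to see how your own C library treats such strings:

  #include <stdio.h>
  #include <stdlib.h>
  #include <errno.h>

  int main (void)
  {
      /* the first string is subnormal (below DBL_MIN), the second is not */
      const char *tests[] = { "4.13108e-312", "5.27295e-270" };
      int i;

      for (i = 0; i < 2; i++) {
          char *endp;
          double x;

          errno = 0;
          x = strtod(tests[i], &endp);
          printf("'%s' -> %g, %s, fully parsed: %s\n",
                 tests[i], x,
                 errno == ERANGE ? "errno = ERANGE" : "errno not set",
                 *endp == '\0' ? "yes" : "no");
      }

      return 0;
  }

On glibc 2.22 the first string reports ERANGE (with a non-zero,
subnormal result) while the second parses cleanly; other conformant
libraries may report no error at all for the subnormal case.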
We could make our test a little more "forgiving" by flagging an
underflowing value as non-numeric only if errno is ERANGE _and_ the 
value returned by strtod is zero.
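In outline, the relaxed test would look something like this (a
hypothetical helper, not gretl's actual function):

  #include <stdlib.h>
  #include <errno.h>

  /* return 1 if s parses fully as a number, accepting
     subnormal-but-nonzero results */
  int looks_numeric (const char *s)
  {
      char *endp;
      double x;

      errno = 0;
      x = strtod(s, &endp);

      if (endp == s || *endp != '\0') {
          return 0;  /* no conversion, or trailing junk */
      } else if (errno == ERANGE && x == 0.0) {
          return 0;  /* underflowed all the way to zero */
      } else {
          return 1;  /* accept, subnormals included */
      }
  }

That way "4.13108e-312" and friends would be read as (tiny) numeric
values rather than rejected, while genuinely unparseable strings would
still come up as non-numeric.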
 Is there an option unbeknown to me to simply substitute such 
 non-numerics by Gretl "NA" directly on import? I.e. suppress the 
 error messages and have gretl proceed nonetheless? 
No, and I don't really think that's a good idea. The presence of 
non-numeric values in a mostly numeric column generally indicates that 
a data file is broken (whether truly non-numeric values have somehow 
crept in or, as in this case, uninitialized floating-point values have 
been spewed into the file). And in that case it's up to the user to 
fix the data.
Allin Cottrell