I need to estimate a "Linear Probability" specification of a binary choice model, specifically by maximum likelihood.
The log-likelihood of such a specification is

loglik = y*log(x'b) + (1-y)*log(1 - x'b)

where y is the (scalar) binary dependent variable, "x" is the vector of regressors and "b" is the vector of unknown parameters.

The argument of the natural logarithm is not inherently constrained to lie in (0,1), so in the iterative maximization process non-positive values of x'b or (1 - x'b) may be encountered at some of the observations, which is inadmissible for the logarithm.
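
For concreteness, a plain version of the estimation step looks roughly like this (only a sketch, with two illustrative regressors and OLS supplying the starting values):

# sketch only: regressor names are illustrative
ols y const x1 x2 --quiet
scalar b0 = $coeff[1]
scalar b1 = $coeff[2]
scalar b2 = $coeff[3]

# the log terms fail whenever ndx falls outside (0,1) at some observation
mle logl = y*log(ndx) + (1-y)*log(1-ndx)
    series ndx = b0 + b1*x1 + b2*x2
    params b0 b1 b2
end mle
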
Left as is, with a "catch" preceding the mle command, Gretl simply discards the offending sample and moves on to the next round of randomly generated regressors (this is a Monte Carlo study), and so on. But there are cases where this happens with every generated sample. Note that the true specification here is indeed the Linear Probability model (the underlying error is Uniform).
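
Schematically, the Monte Carlo loop is structured roughly as follows (again only a sketch; the data-generation step and the names are placeholders):

scalar nrep = 1000          # number of replications (illustrative)
scalar nkept = 0
loop i = 1..nrep
    # [generate fresh X and y with Uniform errors, so the LPM is the true model]
    ols y const x1 x2 --quiet
    scalar b0 = $coeff[1]
    scalar b1 = $coeff[2]
    scalar b2 = $coeff[3]
    catch mle logl = y*log(ndx) + (1-y)*log(1-ndx)
        series ndx = b0 + b1*x1 + b2*x2
        params b0 b1 b2
    end mle
    if $error == 0
        nkept = nkept + 1
        # [store the estimates for this replication]
    endif
endloop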

Is there a way to tell Gretl to keep "insisting" on a sample for which, at some step, it encountered inadmissible values for the logarithm?

I tried using a function, like this (pseudo-code):

[generate X, y]
[initial parameter values through ols]

function series unifll (series y, list X, matrix b)
    series ndx = lincomb(X, b)
    if ndx <= 0 || ndx >= 1
        series liky = NA
    else
        series liky = y*log(ndx) + (1-y)*log(1-ndx)
    endif
    return liky
end function

catch mle logl = unifll(y, X, b)
    matrix b = {b0; b1; b2; b3}
    params b0 b1 b2 b3
end mle

but I get the message "error evaluating 'if'".

I cannot understand why I get this message, so I haven't actually been able to see what happens with the mle algorithm under this approach.

Any suggestions?
-- 
Alecos Papadopoulos
PhD Candidate
Athens University of Economics and Business, Greece
Department of Economics
https://alecospapadopoulos.wordpress.com/