On Mon, 5 Oct 2015, Pindar Os wrote:
Hi there,
it's been a while since my last post... how was the gretl conference in
Berlin?
Both productive and fun (thanks, Sven!).
Today I'm concerned with an optimization problem.
I have the following goal:
Find a linear combination of input series such that:
- the resulting series is used to compute its difference from another
exogenous series
- the variance of this 'difference series' is minimized
- the coefficients are weights on the input series
- there are 2 groups of weights, and within each group the weights are
all positive and sum to 1
(In a later step I also restrict the coefficients so that the weights of
the two groups are identical.)
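To make the objective concrete, here is a minimal Python sketch of the criterion described above (the function and variable names are hypothetical, not from the original hansl code): form the weighted combination of the two groups of input series, subtract it from the exogenous target, and return the variance of the difference.

```python
import statistics

def diff_variance(weights_a, weights_b, inputs_a, inputs_b, target):
    """Variance of (target - weighted combination of input series).

    weights_a / weights_b: the two groups of weights (each meant to be
    positive and sum to 1); inputs_a / inputs_b: lists of input series
    (each a list of floats); target: the exogenous series.
    """
    n = len(target)
    combo = [
        sum(w * s[t] for w, s in zip(weights_a, inputs_a))
        + sum(w * s[t] for w, s in zip(weights_b, inputs_b))
        for t in range(n)
    ]
    diff = [target[t] - combo[t] for t in range(n)]
    return statistics.variance(diff)
```

The optimizer would then search over the weights to minimize this value, subject to the sum-to-one and positivity constraints on each group.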
I managed to use gretl's numerical methods, somehow. Code is below.
However, the results are not as good as the Excel Solver's: gretl
gives a small variance, but not the minimal one.
I want to use gretl since stability analysis is much easier (and I prefer
gretl for such tasks at any rate :-).
Is it necessary to use analytical derivatives?
I think BFGS ought to be able to do a decent job without that. Not
sure, but I think that rather than just truncating your parameters
that are supposed to be in the range 0 to 1, as in
x = (x.<0)? 0 : x
x = (x.>1)? 1: x
you might do better to use a transformation such as cnorm() which
"naturally" enforces the constraint. That is, let the x's be
unconstrained but apply weights of w[i] = cnorm(x[i]). (You'd then
have to use critical(), the inverse CDF function, to set the nth
parameter such that the weights sum to 1.0. However, if the implied
nth weight were negative, I guess you'd want your function to return
NA.)
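The transformation Allin describes can be sketched in Python (gretl's cnorm() and its inverse correspond to the standard normal CDF and quantile function; the helper name below is hypothetical): map n-1 unconstrained parameters through the CDF, let the nth weight be the remainder, and signal NA when that remainder goes negative.

```python
from statistics import NormalDist

def weights_from_params(x):
    """Map n-1 unconstrained parameters to n weights summing to 1.

    Each of the first n-1 weights is cnorm(x[i]), hence in (0, 1);
    the nth weight is whatever remains. Returns None (gretl's NA)
    if the implied nth weight is negative, as suggested above.
    """
    nd = NormalDist()
    w = [nd.cdf(xi) for xi in x]
    last = 1.0 - sum(w)
    if last < 0:
        return None
    return w + [last]
```

With this reparameterization the optimizer works in unconstrained space, so no truncation of the kind shown in the hansl snippet above is needed; the inverse CDF (critical-value function) would recover the x corresponding to a given weight.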
Allin Cottrell