On 16.01.2016 at 22:22, Allin Cottrell wrote:
Internally, if an expression specifies a sub-matrix (e.g. "m[1:1]"), its
result is treated as a matrix, even if it's 1x1. However, if a 1x1
result is assigned to a variable of unspecified type, as in the first
"g" line above, it is "cast" to a scalar.
I have absolutely no idea about the internals here (as you know). But
why isn't it actually done the other way around? If the internal default
is matrix, why not choose that default for an unspecified-type
assignment as well, and then convert it to a scalar later if needed?
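
Just to check that I follow, the current behaviour amounts to something
like this (a made-up hansl sketch, not your actual example):

  matrix m = {1, 2, 3}
  g = m[1:1]          # 1x1 sub-matrix result, but g is cast to scalar
  matrix h = m[1:1]   # declared as matrix, so h stays a 1x1 matrix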
...
You will get an "incompatible types" error on the last line, since you
stated that g is supposed to be a scalar. Only the "imputed" scalar type
is treated as mutable, and only to matrix.
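
So, restating that with another made-up sketch (assuming I read you
correctly):

  scalar g = 1
  matrix m = {1, 2; 3, 4}
  g = m[1:2, 1:2]     # "incompatible types": g was declared scalar
  h = m[1,1]          # h gets the "imputed" scalar type
  h = m[1:2, 1:2]     # presumably allowed: imputed scalar may become a matrix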
What about scalar versus series? "x = 5" can also mean either. Is this
also covered by the changes?
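
I mean that, with a dataset open, either of these (taken separately)
would be legal:

  scalar x = 5    # a single number
  series x = 5    # a constant series over the current sample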
thanks,
sven