On 04.06.2022 at 13:11, Riccardo (Jack) Lucchetti wrote:
> On Fri, 3 Jun 2022, Cottrell, Allin wrote:
>> Maybe, though I gather that for a matrix with a lot more rows than
>> columns the execution time is not that different. Note: we currently
>> assess rank using regular QR, by counting the R elements greater than
>> some specified "tiny" value.
> in Hansl:
> <hansl>
> R = {}
> Q = qrdecomp(mnormal(4) ~ zeros(4) ~ mnormal(4), &R)
> scalar r = sumc(abs(diag(R)) .> 1.0e-12)
> print r
> </hansl>
OK, thanks, that's what I thought.
I'm noting that in the case of a square, invertible input the diagonal
elements of R are not necessarily positive with qrdecomp(). Maybe this
could be mentioned in the doc. (The decomposition would be unique if the
diagonal of R were constrained to be positive.)
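Just to illustrate (a rough, untested sketch): flipping the j-th column
of Q together with the j-th row of R leaves the product Q*R unchanged,
so one can always normalize afterwards so that diag(R) is non-negative:

<hansl>
# untested sketch: sign-normalize the factors so that diag(R) >= 0;
# flipping column j of Q and row j of R leaves Q*R = A unchanged
A = mnormal(4, 4)
R = {}
Q = qrdecomp(A, &R)
s = 2 * (diag(R) .>= 0) - 1     # +1/-1 per diagonal element of R
Q = Q .* s'                     # flip the corresponding columns of Q
R = R .* s                      # flip the corresponding rows of R
print R                         # diagonal is now non-negative
eval maxc(maxr(abs(A - Q*R)))   # should be numerically zero
</hansl>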
Next, I'm noting that in the rectangular case with m rows and n columns,
m > n, the output Q is m x n and R is n x n. According to Wikipedia this
is the "thin" or "reduced" QR factorization, which might be mentioned as
well. In the Netlib terminology this corresponds to A = Q_1 * R, if that
source is preferred. The Matlab "economy-size" convention appears to be
the same thing (https://www.mathworks.com/help/matlab/ref/qr.html).
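For instance (a small untested check, mnormal() just supplying a random
tall input):

<hansl>
# thin/reduced QR of a tall matrix: for a 6 x 3 input,
# Q comes out 6 x 3 and R comes out 3 x 3
X = mnormal(6, 3)
R = {}
Q = qrdecomp(X, &R)
print Q R
eval Q'Q    # approximately the 3 x 3 identity
</hansl>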
> I'm not really sure what the benefit would be from having column
> pivoting, which is relatively easy to do in Hansl anyway (note that P is
> a permutation matrix, so it can be more compactly expressed as a set of
> indices), although of course numerically speaking lapack is unbeatable.
> Could you provide us with a use case where pivoting would be useful?
I'm not sure myself yet, let me think about it some more. What the
Matlab people note is that for sparse inputs the result is much sparser
with permutation (see page linked above).
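In the meantime, just to check that I understand the index-vector idea,
here's a rough untested sketch: it only orders the columns by norm once
up front (not real LAPACK-style pivoting, which re-ranks the remaining
columns at every step), but it shows how P can be carried around as a
row of indices:

<hansl>
# crude stand-in for column pivoting: one-shot ordering by column norm,
# keeping the permutation P as an index vector rather than a matrix
A = mnormal(6, 3) ~ zeros(6, 1) ~ mnormal(6, 1)
norms = sqrt(sumc(A.^2))                 # 1 x k row of column norms
tmp = msortby(-norms' ~ seq(1, cols(A))', 1)
P = tmp[,2]'                             # column indices, by descending norm
R = {}
Q = qrdecomp(A[,P], &R)                  # so that A[,P] = Q*R
print P
print R    # last diagonal entry is ~0, coming from the zero column
</hansl>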
thanks
sven