QR update, QR insert, and QR delete
It would be nice to be able to update a QR factorization. For reference, the library that Octave uses is here:
https://sourceforge.net/projects/qrupdate/
This would allow things like quickly updating an LM or GLM model when adding more observations.
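For intuition, updating after a new observation amounts to appending a row to the design matrix and restoring triangularity with Givens rotations. A minimal sketch in Julia (a hypothetical helper, not qrupdate's actual API; assumes a full, non-economical Q stored as a plain matrix):

```julia
using LinearAlgebra

# Hypothetical helper: append one observation row x' to A = Q*R
# without refactorizing from scratch. Q is m x m (full), R is m x n.
function qraddrow(Q::Matrix, R::Matrix, x::Vector)
    m, n = size(R)
    R = [R; x']                         # (m+1) x n; the last row is the new data
    Q = [Q zeros(m); zeros(1, m) 1.0]   # embed Q in the enlarged space
    for j in 1:n
        # Rotate rows j and m+1 to eliminate the new row, entry by entry.
        G, _ = givens(R[j, j], R[m + 1, j], j, m + 1)
        lmul!(G, R)
        rmul!(Q, G')                    # keep the product Q*R invariant
    end
    return Q, R
end
```

This costs O(mn) per added row instead of the O(mn^2) of a fresh factorization.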
Notice that this is already possible for Cholesky (with LinAlg.lowrankupdate) and that might be sufficient for LMs. For GLMs, I don't think an updated QR will help you much, since all the observations are typically reweighted in each iteration.
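For the LM case that looks like the following sketch (using `lowrankupdate`, which now lives in the LinearAlgebra stdlib): adding an observation x is a rank-one update of the normal equations, A'A + x*x':

```julia
using LinearAlgebra

# Rank-one update of the normal-equations Cholesky factor when one
# observation (row) x is appended to the design matrix A.
A = randn(10, 3)
F = cholesky(Hermitian(A'A))

x = randn(3)              # the new observation
F2 = lowrankupdate(F, x)  # Cholesky factorization of A'A + x*x'
```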
This is also useful for Anderson acceleration of fixed-point solvers.
@andreasnoack Sorry, I'm not actually interested in using QR updates for LM or GLM, I just thought others might be interested in those applications.
Ref. #2929. Best!
Isn't this a duplicate of #2929 (task 3/3)?
Yes, it's a dup, and best to implement in a package given the binary dependency. Could also be a GSoC project.
And this has been done in packages; I think https://github.com/pkofod/Anderson.jl is the right repo to look at (qrtools.jl, at least, is part of it, @pkofod?)
Thanks @ChrisRackauckas. I think that the link you mention only implements a specific case (deleting the leftmost column of the matrix under factorization).
I have implemented arbitrary deletion, as well as the functionality to add columns, in GeneralQP.jl (see the UpdatableQR struct in src/linear_algebra.jl).
The caveat is that UpdatableQR is not an "economical" factorisation, in the sense that the Q generated is a square matrix and all of its columns are stored explicitly. In contrast, qrupdate has methods for updating the factorisation in both economical and full form.
Nevertheless, the UpdatableQR struct of GeneralQP.jl and the Q-less approach of QRupdate.jl might cover quite a few use cases.
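As a rough illustration of the column-insertion part (a hypothetical helper, not GeneralQP.jl's actual code; assumes the full square Q that an UpdatableQR-style struct stores):

```julia
using LinearAlgebra

# Hypothetical helper: append a column a to A = Q*R.
# Q is m x m (full), R is m x n with zeros below row n.
function qraddcol(Q::Matrix, R::Matrix, a::Vector)
    m, n = size(R)
    w = Q' * a                      # new column in Q's coordinates
    for i in m-1:-1:n+1
        # Rotate entries i and i+1 of w to push its mass up to row n+1.
        G, _ = givens(w[i], w[i + 1], i, i + 1)
        lmul!(G, w)
        rmul!(Q, G')                # rows n+1:m of R are zero, so R is unchanged
    end
    return Q, [R w]                 # [R w] is upper triangular, m x (n+1)
end
```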
Thanks! We'll definitely use this in our Anderson acceleration methods. @devmotion @kanav99
For Anderson acceleration we only need those special cases (deleting the leftmost column and adding one on the right), so it seems we could use a simpler implementation, better tuned to our needs, than the one in https://github.com/oxfordcontrol/GeneralQP.jl/blob/master/src/linear_algebra.jl. I tried to achieve this in https://github.com/JuliaNLSolvers/NLsolve.jl/pull/210.
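For reference, the leftmost-column deletion needed here can be sketched as follows (a hypothetical helper, not the NLsolve.jl implementation; assumes a full Q and R stored as plain matrices). Dropping column 1 leaves R[:, 2:end] upper Hessenberg, and a sweep of Givens rotations restores triangularity:

```julia
using LinearAlgebra

# Hypothetical helper: given A = Q*R (full Q, m x m; R, m x n),
# produce the QR factorization of A with its leftmost column removed.
function qrdelcol1(Q::Matrix, R::Matrix)
    R = R[:, 2:end]                 # upper Hessenberg after the shift
    Q = copy(Q)
    for j in 1:size(R, 2)
        # Zero the subdiagonal entry R[j+1, j].
        G, _ = givens(R[j, j], R[j + 1, j], j, j + 1)
        lmul!(G, R)
        rmul!(Q, G')                # keep the product Q*R invariant
    end
    return Q, R
end
```

Each deletion costs O(n^2 + mn) rather than a full refactorization.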
The Q-less approach was discussed in https://github.com/JuliaNLSolvers/NLsolve.jl/issues/128 but seems to square the condition number of the problem.
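The squaring comes from working with A'A instead of A: in the 2-norm, cond(A'A) = cond(A)^2, which a quick check confirms:

```julia
using LinearAlgebra

# The Q-less (normal-equations) route works with A'A, whose condition
# number is the square of A's, halving the usable precision.
A = [1.0 1.0; 1e-4 0.0; 0.0 1e-4]
@show cond(A)     # moderately ill-conditioned
@show cond(A'A)   # roughly cond(A)^2
```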
Yes, my stuff is specific to our needs. Good to see something more general!