# Backward functions for Linear Algebra
This project is still in progress ...
Backward functions for linear algebra, with GPU support.
The adjoint rules are currently hooked into Zygote.jl for testing, but this glue code will be moved elsewhere (for example, merged into Zygote.jl) in the future.
## Why do we need BackwardsLinalg.jl?
Not only in Julia, but also in well-known Python machine learning packages such as PyTorch, numerically stable implementations of linear algebra backward functions are hard to find. This missing piece is crucial for automatic differentiation in tensor network algorithms.
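To make the stability issue concrete: the textbook backward rule for `svd` contains factors of the form `1 / (s_j^2 - s_i^2)`, which blow up whenever two singular values (nearly) coincide, so a robust implementation has to regularize them. The sketch below (our illustration with an assumed broadening scheme, not the package's actual code) shows one common fix, replacing `1/x` by `x / (x^2 + eps)`:

```julia
# Sketch: the off-diagonal factor matrix F from the SVD backward rule,
# with the denominator broadened as 1/x -> x/(x^2 + eps) so that
# (near-)degenerate singular values do not produce huge gradients.
# The broadening scheme is our assumption for illustration.
function svd_backward_F(s::Vector{Float64}; eps::Float64=1e-12)
    n = length(s)
    F = zeros(n, n)
    for i in 1:n, j in 1:n
        if i != j
            d = s[j]^2 - s[i]^2
            F[i, j] = d / (d^2 + eps)   # regularized 1/d, bounded by 1/(2*sqrt(eps))
        end
    end
    return F
end

s = [1.0, 1.0 + 1e-9, 2.0]   # first two singular values nearly degenerate
F = svd_backward_F(s)
# the naive 1/(s[2]^2 - s[1]^2) would be ~5e8; the regularized entry stays bounded
```

Without such a regularization (or a gauge-aware treatment of the degenerate subspace), gradients through `svd` are numerically unreliable exactly in the degenerate situations that tensor network algorithms routinely produce.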
## Supported Functions
Note: loading this package changes the default behavior of these functions. We are considering leaving the output types (`SVD`, `QR`) unchanged later, once Zygote is more mature.
- `svd` and `rsvd` (randomized SVD)
- `cholesky` (see also Nabla.jl)
- `powermethod` (work in progress; requires fixed-point methods)
- `eigen` (following the linear BP paper; only the symmetric case is handled)
- `lq` (similar to `qr`)
- `pfaffian` (no reference implementation found yet)
The backward function for `tr` is implemented elsewhere; see the linked package.
A derivation of the adjoint backward functions can be found here.
## How to Use
The package currently plugs into Zygote:
```julia
using Zygote, BackwardsLinalg

function loss(A)
    M, N = size(A)
    U, S, V = svd(A)
    psi = U[:, 1]
    H = randn(ComplexF64, M, M)
    H += H'
    real(psi' * H * psi)
end

a = randn(ComplexF64, 4, 6)
g = loss'(a)
```
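As a quick sanity check of the `svd` backward rule, one can compare the Zygote gradient of a simple singular-value loss against a central finite difference. This is our own sketch: `nuclear_norm` is a hypothetical helper, and we assume the package's `svd` returns a `(U, S, V)` tuple that Zygote can differentiate through:

```julia
using Zygote, BackwardsLinalg

# Sketch of a finite-difference check (our helper, not part of the package).
# Assumes svd(A) returns a (U, S, V) tuple, so [2] picks the singular values.
nuclear_norm(A) = sum(svd(A)[2])   # sum of singular values

A = randn(4, 6)
g = Zygote.gradient(nuclear_norm, A)[1]

# central finite difference in a single entry
h = 1e-5
dA = zeros(size(A))
dA[2, 3] = h
fd = (nuclear_norm(A + dA) - nuclear_norm(A - dA)) / (2h)
@assert isapprox(fd, g[2, 3]; atol=1e-4)
```

A check like this is worth running before trusting gradients in a larger tensor network program, since errors in backward rules are easy to miss otherwise.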
Try something more interesting: differentiating through a TRG (tensor renormalization group) code. This requires TensorOperations.jl, together with the patch https://github.com/Jutho/TensorOperations.jl/pull/59.