BackwardsLinalg.jl

Automatic differentiation over linear algebra (a Zygote extension)
Author: GiggleLiu

Backward functions for Linear Algebra


Note: this project is still in progress.

Backward functions for linear algebra, with GPU support. They are currently ported to Zygote.jl for testing, but this glue code will be moved elsewhere (e.g. merged into Zygote.jl) in the future.

Why do we need BackwardsLinalg.jl?

Numerically stable implementations of backward rules for linear-algebra functions are hard to find, not only in Julia but also in well-known Python machine-learning packages such as PyTorch. This missing piece is crucial for autodiff applications in tensor-network algorithms.
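
To illustrate the stability issue: the SVD backward rule involves factors like 1/(s_j^2 - s_i^2), which diverge for (nearly) degenerate singular values. A common remedy in the tensor-network literature is to regularize the division; the sketch below is illustrative only, and the helper name and the value of eps are assumptions, not necessarily what this package does.

using LinearAlgebra

# Hypothetical helper (name and eps are illustrative, not this package's API):
# regularized division that keeps 1/(sj^2 - si^2) finite when si ≈ sj.
safe_inverse(x; eps = 1e-12) = x / (x^2 + eps)

S = svdvals(randn(4, 4))
F = [i == j ? zero(eltype(S)) : safe_inverse(S[j]^2 - S[i]^2)
     for i in eachindex(S), j in eachindex(S)]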

Table of Supported Functions

Note: loading this package changes the default behavior of these functions (svd, for instance, returns a plain tuple rather than a factorization object). We are considering preserving the original output types (SVD, QR) later, once Zygote is more capable.
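
For instance, a small sketch (assuming BackwardsLinalg exports its own svd, as the usage example below suggests, and that it returns V rather than Vt):

using BackwardsLinalg, LinearAlgebra

A = randn(4, 6)
U, S, V = svd(A)             # a plain tuple, not a LinearAlgebra.SVD object
A ≈ U * Diagonal(S) * V'     # usual reconstruction (assumes V, not Vt, is returned)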

  • svd and rsvd (randomized SVD)
  • qr
  • cholesky (see also Nabla.jl)
  • powermethod (needs fixed-point methods; work in progress, see the sketch after this list)
  • eigen (following the linear backpropagation paper; only the symmetric case is considered)
  • lq (similar to qr)
  • pfaffian (no reference implementation found anywhere else)
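
For reference, here is a minimal forward-pass sketch of the power method (the function name and iteration count are illustrative). Differentiating it efficiently requires a fixed-point (implicit-function) backward rule rather than unrolling the loop, which is the open item above.

using LinearAlgebra

# Illustrative power method: returns the dominant eigenvalue/eigenvector pair.
function powermethod(A, x; niter = 100)
    for _ in 1:niter
        x = A * x
        x = x / norm(x)          # normalize to avoid overflow
    end
    λ = x' * A * x               # Rayleigh quotient at the fixed point
    return λ, x
end

λ, x = powermethod(Symmetric(randn(4, 4)), randn(4))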

Backward rules for logdet, det and tr can be found in ChainRules.jl and Nabla.jl.
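
For reference, the adjoint rule for logdet is short; here is a minimal standalone sketch for real matrices (not the ChainRules.jl code itself, and the function name is illustrative):

using LinearAlgebra

# For y = logdet(A) with real A: dy = tr(A^-1 dA), hence Ā = ȳ * (A^-1)ᵀ.
logdet_pullback(A, ȳ) = ȳ * transpose(inv(A))

A = randn(4, 4); A = A' * A + I    # positive definite, so logdet is well defined
Ā = logdet_pullback(A, 1.0)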

Derivations of the adjoint backward functions can be found here.

How to Use

The package currently hooks into Zygote.jl:

using Zygote, BackwardsLinalg

function loss(A)
    M, N = size(A)
    U, S, V = svd(A)
    psi = U[:, 1]                  # first left singular vector
    H = randn(ComplexF64, M, M)
    H += H'                        # make H Hermitian
    real(psi' * H * psi)           # expectation value ⟨psi|H|psi⟩
end

a = randn(ComplexF64, 4, 6)
g = loss'(a)                       # Zygote's adjoint syntax: gradient of loss at a
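
Note that loss draws a fresh random H on every call, so its gradient is stochastic. A deterministic variant is easy to write (a sketch, with H fixed outside the function; the name loss_fixed is illustrative):

H0 = randn(ComplexF64, 4, 4); H0 += H0'   # fixed Hermitian observable

function loss_fixed(A)
    U, S, V = svd(A)
    psi = U[:, 1]
    real(psi' * H0 * psi)
end

g = loss_fixed'(randn(ComplexF64, 4, 6))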

To try something more interesting (backpropagation through a TRG code), TensorOperations.jl is required, together with the patch Jutho/TensorOperations.jl#59:

julia test/trg.jl
