Automatic Differentiation
Lux is not an AD package, but it composes well with most of the AD packages available in the Julia ecosystem. This document lists the current level of support for various AD packages in Lux. Additionally, we provide some convenience functions for working with AD.
Overview
AD Package | CPU | GPU | Nested 2nd Order AD | Support Class |
---|---|---|---|---|
ChainRules.jl [1] | ✔️ | ✔️ | ✔️ | Tier I |
Zygote.jl | ✔️ | ✔️ | ✔️ | Tier I |
ForwardDiff.jl | ✔️ | ✔️ | ✔️ | Tier I |
ReverseDiff.jl | ✔️ | ❌ | ❌ | Tier II |
Tracker.jl | ✔️ | ✔️ | ❌ | Tier II |
Enzyme.jl | ✔️ | ❓[2] | ❓[2] | Tier II |
Tapir.jl | ❓[2] | ❓[2] | ❌ | Tier IV |
Diffractor.jl | ❓[2] | ❓[2] | ❓[2] | Tier IV |
Support Class
Tier I: These packages are fully supported and tested extensively. They often have special rules to enhance performance. Issues for these backends take the highest priority.
Tier II: These packages are supported and extensively tested, but often don't have the best performance. Issues against these backends are less critical, but we fix them when possible. (Some specific edge cases, especially with AMDGPU, are known to fail here.)
Tier III: These packages are somewhat tested but expect rough edges. Help us add tests for these backends to get them to Tier II status.
Tier IV: We don't know if these packages currently work with Lux. We'd love to add tests for these backends, but currently these are not our priority.
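Throughout the rest of this page, an AD package is selected via the backend structs from ADTypes.jl. As a minimal sketch (assuming ADTypes.jl and the corresponding AD packages are installed), the packages in the table above map to backends like so:

```julia
using ADTypes

# ADTypes.jl backend structs select which AD package the helpers below dispatch to
fwd_backend = AutoForwardDiff()   # ForwardDiff.jl
rev_backend = AutoZygote()        # Zygote.jl
enz_backend = AutoEnzyme()        # Enzyme.jl
```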
JVP & VJP Wrappers
jacobian_vector_product(f, backend::AbstractADType, x, u)
Compute the Jacobian-Vector Product
Backends & AD Packages
Supported Backends | Packages Needed |
---|---|
AutoForwardDiff | ForwardDiff.jl |
Warning
Gradient wrt `u` in the reverse pass is always dropped.
Arguments
- `f`: The function to compute the jacobian of.
- `backend`: The backend to use for computing the JVP.
- `x`: The input to the function.
- `u`: An object of the same structure as `x`.
Returns
- `v`: The Jacobian-Vector Product.
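A minimal usage sketch, assuming `jacobian_vector_product` is available from Lux as documented above and that ForwardDiff.jl is installed for the `AutoForwardDiff` backend; `f` below is a hypothetical example function:

```julia
using Lux, ADTypes, ForwardDiff

f(x) = x .^ 2 .+ sin.(x)          # any array-to-array function works
x = rand(Float32, 4)
u = rand(Float32, 4)              # same structure as x

# Jv ≈ J_f(x) * u, computed without materializing the full Jacobian
Jv = jacobian_vector_product(f, AutoForwardDiff(), x, u)
```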
vector_jacobian_product(f, backend::AbstractADType, x, u)
Compute the Vector-Jacobian Product
Backends & AD Packages
Supported Backends | Packages Needed |
---|---|
AutoZygote | Zygote.jl |
Warning
Gradient wrt `u` in the reverse pass is always dropped.
Arguments
- `f`: The function to compute the jacobian of.
- `backend`: The backend to use for computing the VJP.
- `x`: The input to the function.
- `u`: An object of the same structure as `f(x)`.
Returns
- `v`: The Vector-Jacobian Product.
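A minimal usage sketch, assuming `vector_jacobian_product` is available from Lux as documented above and that Zygote.jl is installed for the `AutoZygote` backend; `f` below is a hypothetical example function:

```julia
using Lux, ADTypes, Zygote

f(x) = x .^ 2 .+ sin.(x)
x = rand(Float32, 4)
u = rand(Float32, 4)              # same structure as f(x)

# vᵀ J_f(x), i.e. the pullback of f at x applied to u
vJ = vector_jacobian_product(f, AutoZygote(), x, u)
```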
Batched AD
batched_jacobian(f, backend::AbstractADType, x::AbstractArray)
Computes the Jacobian of a function `f` with respect to a batch of inputs `x`. This expects the following properties for `y = f(x)`:
- `ndims(y) ≥ 2`
- `size(y, ndims(y)) == size(x, ndims(x))`
Backends & AD Packages
Supported Backends | Packages Needed |
---|---|
AutoForwardDiff | ForwardDiff.jl |
AutoZygote | Zygote.jl |
Arguments
- `f`: The function to compute the jacobian of.
- `backend`: The backend to use for computing the jacobian.
- `x`: The input to the function. Must have `ndims(x) ≥ 2`.
Returns
- `J`: The Jacobian of `f` with respect to `x`. This will be a 3D Array. If the dimensions of `x` are `(N₁, N₂, ..., Nₙ, B)` and of `y` are `(M₁, M₂, ..., Mₘ, B)`, then `J` will be a `((M₁ × M₂ × ... × Mₘ), (N₁ × N₂ × ... × Nₙ), B)` Array.
Danger
`f(x)` must not mix the batch dimensions, else the result will be incorrect. For example, if `f` contains operations like batch normalization, the result will be incorrect.
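A minimal usage sketch, assuming `batched_jacobian` is available from Lux as documented above and that Zygote.jl is installed; the elementwise `f` below is a hypothetical example that keeps batch entries independent:

```julia
using Lux, ADTypes, Zygote

x = rand(Float32, 3, 8)           # 8 samples of size 3; batch dimension is last
f(x) = x .^ 2                     # acts elementwise, so batch entries never mix

J = batched_jacobian(f, AutoZygote(), x)
size(J)                           # (3, 3, 8): one 3×3 Jacobian per batch element
```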
Nested 2nd Order AD
Consult the manual page on Nested AD for information on nested automatic differentiation.