Automatic Differentiation

Lux is not an AD package, but it composes well with most of the AD packages available in the Julia ecosystem. This document lists the current level of support for various AD packages in Lux. Additionally, we provide some convenience functions for working with AD.
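For instance, taking gradients of a Lux model with Zygote.jl composes as shown below. This is a minimal sketch assuming Zygote.jl and a recent Lux release; the model architecture, input shape, and loss are illustrative.

```julia
using Lux, Random, Zygote

rng = Random.default_rng()
model = Chain(Dense(4 => 8, tanh), Dense(8 => 2))
ps, st = Lux.setup(rng, model)  # parameters and states live outside the model
x = randn(rng, Float32, 4, 16)

# Reverse-mode gradient of a scalar loss with respect to the parameters.
# `model(x, ps, st)` returns `(output, updated_state)`; `first` takes the output.
loss(ps) = sum(abs2, first(model(x, ps, st)))
gs = only(Zygote.gradient(loss, ps))
```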

Overview

| AD Package | Mode | CPU | GPU | TPU | Nested 2nd Order AD | Support Class |
|:---|:---|:---:|:---:|:---:|:---:|:---|
| Reactant.jl[1] + Enzyme.jl | Reverse | ✔️ | ✔️ | ✔️ | ✔️ | Tier I |
| ChainRules.jl[2] | Reverse | ✔️ | ✔️ | ❌ | ✔️ | Tier I |
| Enzyme.jl | Reverse | ✔️ | ❓[3] | ❌ | ❓[3] | Tier I[4] |
| Zygote.jl | Reverse | ✔️ | ✔️ | ❌ | ✔️ | Tier I |
| ForwardDiff.jl | Forward | ✔️ | ✔️ | ❌ | ✔️ | Tier I |
| ReverseDiff.jl | Reverse | ✔️ | ❌ | ❌ | ❌ | Tier II |
| Tracker.jl | Reverse | ✔️ | ✔️ | ❌ | ❌ | Tier II |
| Mooncake.jl | Reverse | ❓[3] | ❌ | ❌ | ❌ | Tier III |
| Diffractor.jl | Forward | ❓[3] | ❓[3] | ❌ | ❓[3] | Tier III |
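
The "Nested 2nd Order AD" column refers to differentiating through a function that itself contains a gradient or Jacobian call. Below is a minimal sketch of the pattern using a StatefulLuxLayer with Zygote.jl as the outer backend; Lux automatically switches the inner call to forward mode (ForwardDiff.jl). The model, input, and loss here are illustrative.

```julia
using Lux, Random, Zygote

rng = Random.default_rng()
model = Chain(Dense(2 => 4, tanh), Dense(4 => 1))
ps, st = Lux.setup(rng, model)
x = randn(rng, Float32, 2, 8)

# Outer reverse-mode gradient w.r.t. the parameters of a loss that itself
# contains an inner gradient w.r.t. the input (nested 2nd order AD)
function loss(ps)
    # Wrap the model so it closes over its state; `{true}` fixes the state type
    smodel = StatefulLuxLayer{true}(model, ps, st)
    # Inner gradient w.r.t. the input; Lux rewrites this call to use ForwardDiff
    ∂x = only(Zygote.gradient(Base.Fix1(sum, abs2) ∘ smodel, x))
    return sum(abs2, ∂x)
end

gs = only(Zygote.gradient(loss, ps))
```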

Recommendations

  • For CPU use cases:

    1. Use Reactant.jl + Enzyme.jl for the best performance as well as mutation support. When available, this is the most reliable and fastest option (see the training-step sketch after this list).

    2. Use Zygote.jl for the best performance without Reactant.jl. This is the most reliable and fastest CPU option for the time being. (We are working on faster Enzyme support for CPU.)

    3. Use Enzyme.jl if your code contains mutations and/or Zygote.jl fails.

    4. If Enzyme.jl fails for some reason, (open an issue and) try ReverseDiff.jl (possibly with compiled mode).

  • For GPU use cases:

    1. Use Reactant.jl + Enzyme.jl for the best performance. This is the most reliable and fastest option, but it presently supports only NVIDIA GPUs; AMD GPUs are not yet supported.

    2. Use Zygote.jl for the best performance on non-NVIDIA GPUs. This is the most reliable and fastest non-Reactant.jl option for GPU for the time being. We are working on supporting Enzyme.jl without Reactant.jl on GPU as well.

  • For TPU use cases:

    1. Use Reactant.jl. This is the only supported (and fastest) option.
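
A minimal training-step sketch with Reactant.jl + Enzyme.jl via Lux's Training API is shown below. It assumes recent Lux, Reactant, Enzyme, and Optimisers releases; the model, data, and hyperparameters are illustrative.

```julia
using Lux, Reactant, Enzyme, Optimisers, Random
using ADTypes: AutoEnzyme

dev = reactant_device()  # XLA device via Reactant (CPU / NVIDIA GPU / TPU)
model = Chain(Dense(4 => 8, tanh), Dense(8 => 2))
ps, st = Lux.setup(Random.default_rng(), model) |> dev
x = randn(Float32, 4, 16) |> dev
y = randn(Float32, 2, 16) |> dev

train_state = Training.TrainState(model, ps, st, Adam(0.001f0))
# Compiles the forward and backward passes with Reactant and
# differentiates with Enzyme; returns gradients, loss, stats, and new state
grads, loss, stats, train_state = Training.single_train_step!(
    AutoEnzyme(), MSELoss(), (x, y), train_state
)
```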

Support Class

  1. Tier I: These packages are fully supported and have been tested extensively. They often have special rules to enhance performance. Issues for these backends take the highest priority.

  2. Tier II: These packages are supported and extensively tested, but often don't have the best performance. Issues against these backends are less critical, but we fix them when possible. (Some specific edge cases, especially with AMDGPU, are known to fail here.)

  3. Tier III: We don't know if these packages currently work with Lux. We'd love to add tests for these backends, but they are not currently a priority for us.

Footnotes


  1. Note that Reactant.jl is not really an AD package, but a tool for compiling functions, including the use of EnzymeMLIR for AD via Enzyme.jl. We have first-class support for using Reactant.jl for inference and training when Enzyme.jl is used for differentiation.

  2. Note that ChainRules.jl is not really an AD package, but we have first-class support for packages that use rrules.

  3. This feature is supported downstream, but we don't extensively test it to ensure that it works with Lux.

  4. Currently, Enzyme outperforms other AD packages in terms of CPU performance. However, there are some edge cases where it might not work with Lux when not using Reactant. We are working on improving compatibility; please report any issues you encounter, and try Reactant if something fails.