Lux.jl: Elegant & Performant Scientific Machine Learning in JuliaLang
A pure Julia deep learning framework designed for Scientific Machine Learning.

Fast & Extensible
Lux.jl is written in Julia itself, making it highly extensible. CUDA and AMDGPU are supported as first-class backends, with experimental support for Metal and Intel GPUs.
It's easy to install Lux.jl. Since Lux.jl is registered in the Julia General registry, you can simply run the following command in the Julia REPL:
julia> using Pkg
julia> Pkg.add("Lux")
If you want to use the latest unreleased version of Lux.jl, run the following command instead (in most cases the released version will be the same as the version on GitHub):
julia> using Pkg
julia> Pkg.add(url="https://github.com/LuxDL/Lux.jl")
Install the package(s) for your GPU backend:

NVIDIA GPUs (CUDA):
using Pkg
Pkg.add("LuxCUDA")
# or
Pkg.add(["CUDA", "cuDNN"])

AMD GPUs (ROCm):
using Pkg
Pkg.add("AMDGPU")

Apple GPUs (Metal):
using Pkg
Pkg.add("Metal")

Intel GPUs (oneAPI):
using Pkg
Pkg.add("oneAPI")
Run the following to access a device:

NVIDIA GPUs (CUDA):
using Lux, LuxCUDA
const dev = gpu_device()

AMD GPUs (ROCm):
using Lux, AMDGPU
const dev = gpu_device()

Apple GPUs (Metal):
using Lux, Metal
const dev = gpu_device()

Intel GPUs (oneAPI):
using Lux, oneAPI
const dev = gpu_device()
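Whichever backend you load, the returned device object can be called (or piped into) to move model parameters, states, and data onto the accelerator. A minimal sketch using the CUDA backend (the layer sizes here are illustrative; `gpu_device()` falls back to the CPU if no functional GPU is found):

using Lux, LuxCUDA, Random

const dev = gpu_device()

model = Dense(4 => 2)
ps, st = Lux.setup(Random.default_rng(), model)

# Move parameters, states, and input data to the selected device
ps, st = ps |> dev, st |> dev
x = rand(Float32, 4, 8) |> dev

# Forward pass: returns the output and the (possibly updated) states
y, st = model(x, ps, st)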
Install the following package:
using Pkg
Pkg.add("Reactant")
Run the following to access a device (Reactant automatically selects the best backend by default):

CPU:
using Reactant, Lux
Reactant.set_default_backend("cpu")
const dev = reactant_device()

GPU:
using Reactant, Lux
Reactant.set_default_backend("gpu")
const dev = reactant_device()

TPU:
using Reactant, Lux
Reactant.set_default_backend("tpu")
const dev = reactant_device()
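As with the other backends, the Reactant device moves parameters, states, and inputs onto the selected accelerator; execution then typically goes through Reactant's compilation macros. A sketch under the assumption that `Reactant.@jit` is used to trace and compile the forward pass (consult the Reactant documentation for the current compilation API; the layer sizes are illustrative):

using Reactant, Lux, Random

const dev = reactant_device()

model = Dense(4 => 2)
ps, st = Lux.setup(Random.default_rng(), model)

# Convert parameters, states, and inputs to Reactant arrays
ps, st = ps |> dev, st |> dev
x = rand(Float32, 4, 8) |> dev

# Trace, compile (via XLA), and run the forward pass
y, _ = Reactant.@jit model(x, ps, st)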