Utilities
Index
Lux.StatefulLuxLayer
Lux.cpu
Lux.disable_stacktrace_truncation!
Lux.f16
Lux.f32
Lux.f64
Lux.foldl_init
Lux.gpu
Lux.istraining
Lux.multigate
Device Management / Data Transfer
cpu(x)
Transfer x to CPU.
Warning
This function has been deprecated. Use cpu_device instead.
gpu(x)
Transfer x to the GPU determined by the backend set using Lux.gpu_backend!.
Warning
This function has been deprecated. Use gpu_device instead. Using this function inside performance-critical code will cause massive slowdowns due to type inference failure.
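Both deprecated functions map onto device objects from LuxDeviceUtils.jl. A minimal migration sketch (assuming LuxDeviceUtils.jl is loaded; the array x is purely illustrative):

```julia
using Lux, LuxDeviceUtils

# Construct the device objects once, outside performance-critical code:
gdev = gpu_device()  # respects the backend chosen via Lux.gpu_backend!
cdev = cpu_device()

x = rand(Float32, 4, 4)
x_gpu = gdev(x)      # replaces the deprecated gpu(x)
x_cpu = cdev(x_gpu)  # replaces the deprecated cpu(x)
```

Because the device objects are concrete values, calls through them remain type stable, avoiding the inference failures noted above.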
Warning
For detailed API documentation on data transfer, check out the LuxDeviceUtils.jl documentation.
Weight Initialization
Warning
For API documentation on initialization, check out the WeightInitializers.jl documentation.
Miscellaneous Utilities
foldl_init(op, x)
foldl_init(op, x, init)
Exactly the same as foldl(op, x; init) in the forward pass, but also gives gradients with respect to init in the backward pass.
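A minimal sketch of the difference (assuming Zygote.jl as the AD backend; the inputs are illustrative):

```julia
using Lux, Zygote

xs = [1.0, 2.0, 3.0]

# Forward pass: identical to foldl(+, xs; init = 10.0)
y = Lux.foldl_init(+, xs, 10.0)  # 16.0

# Backward pass: unlike foldl, a gradient w.r.t. init is produced
g = only(Zygote.gradient(init -> Lux.foldl_init(+, xs, init), 10.0))  # 1.0
```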
istraining(::Val{training})
istraining(st::NamedTuple)
Returns true if training is true or if st contains a training field with value true; otherwise returns false.
The method is undefined if st.training is not of type Val.
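A quick sketch of the dispatch behavior described above (the NamedTuples are illustrative):

```julia
using Lux

Lux.istraining(Val(true))                 # true
Lux.istraining(Val(false))                # false
Lux.istraining((; training = Val(true)))  # true: has a `training` field set to true
Lux.istraining((; rng = nothing))         # false: no `training` field
```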
multigate(x::AbstractArray, ::Val{N})
Split up x into N equally sized chunks (along dimension 1).
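For example (a minimal sketch; the input is illustrative):

```julia
using Lux

x = collect(1.0:12.0)                  # 12 entries along dimension 1
g1, g2, g3 = Lux.multigate(x, Val(3))  # three equally sized chunks
g1  # view of the first chunk: [1.0, 2.0, 3.0, 4.0]
```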
Updating Floating Point Precision
By default, Lux uses Float32 for all parameters and states. To update the precision, simply pass the parameters / states / arrays into one of the following functions (a usage sketch follows the function descriptions).
f16(m)
Converts the eltype of m's floating point values to Float16. Recurses into structs marked with Functors.@functor.
f32(m)
Converts the eltype of m's floating point values to Float32. Recurses into structs marked with Functors.@functor.
f64(m)
Converts the eltype of m's floating point values to Float64. Recurses into structs marked with Functors.@functor.
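A minimal usage sketch (the Dense model is illustrative; Lux.setup is used as documented elsewhere to create the parameters):

```julia
using Lux, Random

rng = Random.default_rng()
ps, st = Lux.setup(rng, Dense(2 => 3))

ps64 = Lux.f64(ps)   # promote all floating point leaves to Float64
eltype(ps64.weight)  # Float64

ps16 = Lux.f16(ps)   # demote to Float16
eltype(ps16.weight)  # Float16
```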
Stateful Layer
StatefulLuxLayer(model, ps, st; st_fixed_type = Val(true))
Warning
This is not a Lux.AbstractExplicitLayer.
A convenience wrapper over Lux layers that stores the parameters and states internally. Most users should not be using this version. It comes in handy when Lux internally uses @compact to construct models, and in SciML codebases where propagating the state might involve Boxing.
For a motivating example, see the Neural ODE tutorial.
Arguments
model: A Lux layer
ps: The parameters of the layer. This can be set to nothing if the user provides the parameters on function call
st: The state of the layer
Keyword Arguments
st_fixed_type: If Val(true), then the type of the state is fixed, i.e., typeof(last(model(x, ps, st))) == typeof(st). If this is not the case, then st_fixed_type must be set to Val(false). If st_fixed_type is set to Val(false), then type stability is not guaranteed.
Inputs
x: The input to the layer
ps: The parameters of the layer. Optional, defaults to s.ps
Outputs
y: The output of the layer
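A minimal construction sketch (the Dense model and input sizes are illustrative):

```julia
using Lux, Random

rng = Random.default_rng()
model = Dense(2 => 3)
ps, st = Lux.setup(rng, model)

# Store ps and st inside the wrapper:
smodel = Lux.StatefulLuxLayer(model, ps, st)

x = randn(rng, Float32, 2, 4)
y  = smodel(x)      # uses the internally stored parameters
y2 = smodel(x, ps)  # parameters may also be supplied explicitly
```

Note that the wrapper returns only y; the updated state is kept internally rather than being threaded through the call as with a plain Lux layer.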
Truncated Stacktraces
disable_stacktrace_truncation!(; disable::Bool=true)
An easy way to update TruncatedStacktraces.VERBOSE without having to load it manually. Effectively does TruncatedStacktraces.VERBOSE[] = disable.
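For example:

```julia
using Lux

# Print full stacktraces; effectively TruncatedStacktraces.VERBOSE[] = true
Lux.disable_stacktrace_truncation!(; disable = true)
```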