diff --git a/docs/src/faq.md b/docs/src/faq.md
index baafbdc..1139ae5 100644
--- a/docs/src/faq.md
+++ b/docs/src/faq.md
@@ -1,11 +1,24 @@
 # Frequently Asked Questions
 
+## Supported autodiff backends
+
+- Forward mode: [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl)
+- Reverse mode: all the packages compatible with [ChainRules.jl](https://github.com/JuliaDiff/ChainRules.jl)
+
+In the future, we would like to add [Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl) support.
+
 ## Higher-dimensional arrays
 
-For simplicity, the examples only display functions that work on vectors.
-However, arbitrary array sizes are supported.
+For simplicity, our examples only display functions that eat and spit out vectors.
+However, arbitrary array shapes are supported, as long as the forward _and_ conditions callables return similarly shaped arrays.
 Beware however, sparse arrays will be densified in the differentiation process.
 
+## Scalar input / output
+
+Functions that eat or spit out a single number are not supported.
+The forward _and_ conditions callables need arrays: for example, instead of returning `value` you should return `[value]` (a 1-element `Vector`).
+Consider using an `SVector` from [StaticArrays.jl](https://github.com/JuliaArrays/StaticArrays.jl) if you seek increased performance.
+
 ## Multiple inputs / outputs
 
 In this package, implicit functions can only take a single input array `x` and output a single output array `y` (plus the additional info `z`).
@@ -26,7 +39,7 @@ f(x::ComponentVector) = f(x.a, x.b)
 
 The same trick works for multiple outputs.
 
-## Constrained optimization modeling
+## Modeling constrained optimization problems
 
 To express constrained optimization problems as implicit functions, you might need differentiable projections or proximal operators to write the optimality conditions.
 See [_Efficient and modular implicit differentiation_](https://arxiv.org/abs/2105.15183) for precise formulations.
@@ -37,10 +50,3 @@ In case these operators are too complicated to code them yourself, here are a fe
 - [ProximalOperators.jl](https://github.com/JuliaFirstOrder/ProximalOperators.jl)
 
 An alternative is differentiating through the KKT conditions, which is exactly what [DiffOpt.jl](https://github.com/jump-dev/DiffOpt.jl) does for JuMP models.
-
-## Which autodiff backends are supported?
-
-- Forward mode: ForwardDiff.jl
-- Reverse mode: all the packages compatible with [ChainRules.jl](https://github.com/JuliaDiff/ChainRules.jl)
-
-In the future, we would like to add [Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl) support.
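The scalar-wrapping advice added in this patch can be sketched as follows. This is a minimal illustration only: the `forward`/`conditions` names and the `z = nothing` extra-info value are assumptions for the sketch, not the package's exact API.

```julia
using StaticArrays

# Hypothetical scalar problem y = x / 2, wrapped so that both callables
# eat and spit out 1-element arrays instead of bare numbers.
forward(x) = SVector(only(x) / 2), nothing            # y as a 1-element SVector, plus extra info z
conditions(x, y, z) = SVector(only(y) - only(x) / 2)  # residual, also a 1-element array

y, z = forward(SVector(4.0))
@assert y == SVector(2.0)                             # the wrapped result is an array
@assert conditions(SVector(4.0), y, z) == SVector(0.0)  # the conditions vanish at the solution
```

Using `SVector` instead of `[value]` avoids heap allocation for these tiny arrays, which is the performance benefit the FAQ entry alludes to.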