Document single output (#53)
* Document single output

* Better titles
gdalle authored May 24, 2023
1 parent cf5ab9d commit 58fa014
Showing 1 changed file with 16 additions and 10 deletions: docs/src/faq.md
@@ -1,11 +1,24 @@
# Frequently Asked Questions

## Supported autodiff backends

- Forward mode: [ForwardDiff.jl](https://github.com/JuliaDiff/ForwardDiff.jl)
- Reverse mode: all the packages compatible with [ChainRules.jl](https://github.com/JuliaDiff/ChainRules.jl)

In the future, we would like to add [Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl) support.

## Higher-dimensional arrays

For simplicity, the examples only display functions that work on vectors.
However, arbitrary array sizes are supported.
For simplicity, our examples only display functions that eat and spit out vectors.
However, arbitrary array shapes are supported, as long as the forward _and_ conditions callables return arrays of matching shapes.
Beware, however, that sparse arrays will be densified in the differentiation process.
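
As a rough sketch, here is an elementwise square root written so that both callables work on matrices just as well as on vectors. The `ImplicitFunction(forward, conditions)` construction and call signature below are an assumption for illustration; check the documentation of your version for the exact API.

```julia
using ImplicitDifferentiation

# Both callables accept and return arrays of the same shape — here matrices.
forward(x) = sqrt.(x)            # elementwise square root of any array
conditions(x, y) = y .^ 2 .- x   # residual, same shape as y

# Assumed constructor / call signature — check the docs of your version:
implicit = ImplicitFunction(forward, conditions)
y = implicit(rand(3, 2))         # matrix in, matrix out
```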

## Scalar input / output

Functions that eat or spit out a single number are not supported.
The forward _and_ conditions callables need arrays: for example, instead of returning `value` you should return `[value]` (a 1-element `Vector`).
Consider using an `SVector` from [StaticArrays.jl](https://github.com/JuliaArrays/StaticArrays.jl) if you seek increased performance.
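
The wrapping is purely mechanical. Here is a minimal sketch for a scalar square root (the helper names are ours; the only point is the 1-element arrays):

```julia
using StaticArrays

# The scalar problem y = sqrt(x), wrapped in 1-element arrays.
forward(x) = [sqrt(only(x))]                 # returns a 1-element Vector
conditions(x, y) = [only(y)^2 - only(x)]     # residual is also a 1-element array

# With StaticArrays, the same wrappers avoid heap allocations:
forward_static(x) = SVector(sqrt(only(x)))
conditions_static(x, y) = SVector(only(y)^2 - only(x))
```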

## Multiple inputs / outputs

In this package, implicit functions can only take a single input array `x` and return a single output array `y` (plus the additional info `z`).
@@ -26,7 +39,7 @@ f(x::ComponentVector) = f(x.a, x.b)

The same trick works for multiple outputs.
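
For instance, two input arrays `a` and `b` can be bundled into a single `ComponentVector` from ComponentArrays.jl (a hypothetical sketch; the function `f` below is made up for illustration):

```julia
using ComponentArrays

f(a, b) = a .+ 2 .* b                   # original function of two arrays
f(x::ComponentVector) = f(x.a, x.b)     # single-array wrapper, as above

x = ComponentVector(a = rand(3), b = rand(3))
f(x)  # identical to f(x.a, x.b)
```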

## Constrained optimization modeling
## Modeling constrained optimization problems

To express constrained optimization problems as implicit functions, you might need differentiable projections or proximal operators to write the optimality conditions.
See [_Efficient and modular implicit differentiation_](https://arxiv.org/abs/2105.15183) for precise formulations.
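
As an illustration (a hypothetical sketch, not taken from the paper), the optimality conditions of a box-constrained problem can be written as the fixed point of a projected gradient step, where the projection is a simple `clamp`:

```julia
# Hypothetical box-constrained least squares:  min_y 0.5 * sum(abs2, y .- x)  s.t.  lo .<= y .<= hi
lo, hi, η = 0.0, 1.0, 0.5
grad(x, y) = y .- x                        # gradient of the objective
project(y) = clamp.(y, lo, hi)             # box projection, differentiable a.e.

# Zero exactly at the projected-gradient fixed point, i.e. at the constrained optimum.
conditions(x, y) = y .- project(y .- η .* grad(x, y))
```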
@@ -37,10 +50,3 @@ In case these operators are too complicated to code yourself, here are a few:
- [ProximalOperators.jl](https://github.com/JuliaFirstOrder/ProximalOperators.jl)

An alternative is differentiating through the KKT conditions, which is exactly what [DiffOpt.jl](https://github.com/jump-dev/DiffOpt.jl) does for JuMP models.

## Which autodiff backends are supported?

- Forward mode: ForwardDiff.jl
- Reverse mode: all the packages compatible with [ChainRules.jl](https://github.com/JuliaDiff/ChainRules.jl)

In the future, we would like to add [Enzyme.jl](https://github.com/EnzymeAD/Enzyme.jl) support.
