diff --git a/README.md b/README.md
index e24717cf..4b1309c6 100644
--- a/README.md
+++ b/README.md
@@ -63,7 +63,8 @@ If you found this library useful in academic research, please cite: [(arXiv link
 **Always useful**
 
 [Equinox](https://github.com/patrick-kidger/equinox): neural networks and everything not already in core JAX!
-[jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.
+[jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.
+[diffraxtra](https://github.com/GalacticDynamics/diffraxtra): extras for `diffrax`; OOP and vectorization.
 
 **Deep learning**
 
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.
diff --git a/docs/index.md b/docs/index.md
index 8987a9f7..872cec4b 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -52,6 +52,7 @@ Have a look at the [Getting Started](./usage/getting-started.md) page.
 **Always useful**
 [Equinox](https://github.com/patrick-kidger/equinox): neural networks and everything not already in core JAX!
 [jaxtyping](https://github.com/patrick-kidger/jaxtyping): type annotations for shape/dtype of arrays.
+[diffraxtra](https://github.com/GalacticDynamics/diffraxtra): extras for `diffrax`; OOP and vectorization.
 
 **Deep learning**
 [Optax](https://github.com/deepmind/optax): first-order gradient (SGD, Adam, ...) optimisers.