From 17c71e34fe0b57e5d3d75eeb38d1759e18da1ecc Mon Sep 17 00:00:00 2001
From: Avik Pal
Date: Thu, 3 Oct 2024 11:13:31 -0400
Subject: [PATCH] docs: fix external references

---
 docs/src/basics/autodiff.md             | 23 ++++++++++++-----------
 docs/src/tutorials/code_optimization.md |  2 +-
 docs/src/tutorials/large_systems.md     |  9 +++++----
 src/algorithms/extension_algs.jl        |  2 +-
 4 files changed, 19 insertions(+), 17 deletions(-)

diff --git a/docs/src/basics/autodiff.md b/docs/src/basics/autodiff.md
index dc3166c6e..d2d01e00b 100644
--- a/docs/src/basics/autodiff.md
+++ b/docs/src/basics/autodiff.md
@@ -8,24 +8,25 @@
 ## Summary of Finite Differencing Backends
 
-  - [`AutoFiniteDiff`](@extref ADTypes): Finite differencing using `FiniteDiff.jl`, not
-    optimal but always applicable.
-  - [`AutoFiniteDifferences`](@extref ADTypes): Finite differencing using
-    `FiniteDifferences.jl`, not optimal but always applicable.
+  - [`AutoFiniteDiff`](@extref ADTypes.AutoFiniteDiff): Finite differencing using
+    `FiniteDiff.jl`, not optimal but always applicable.
+  - [`AutoFiniteDifferences`](@extref ADTypes.AutoFiniteDifferences): Finite differencing
+    using `FiniteDifferences.jl`, not optimal but always applicable.
 
 ## Summary of Forward Mode AD Backends
 
-  - [`AutoForwardDiff`](@extref ADTypes): The best choice for dense problems.
-  - [`AutoPolyesterForwardDiff`](@extref ADTypes): Might be faster than
-    [`AutoForwardDiff`](@extref ADTypes) for large problems. Requires
+  - [`AutoForwardDiff`](@extref ADTypes.AutoForwardDiff): The best choice for dense
+    problems.
+  - [`AutoPolyesterForwardDiff`](@extref ADTypes.AutoPolyesterForwardDiff): Might be faster
+    than [`AutoForwardDiff`](@extref ADTypes.AutoForwardDiff) for large problems. Requires
     `PolyesterForwardDiff.jl` to be installed and loaded.
 
 ## Summary of Reverse Mode AD Backends
 
-  - [`AutoZygote`](@extref ADTypes): The fastest choice for non-mutating array-based (BLAS)
-    functions.
-  - [`AutoEnzyme`](@extref ADTypes): Uses `Enzyme.jl` Reverse Mode and works for both
-    in-place and out-of-place functions.
+  - [`AutoZygote`](@extref ADTypes.AutoZygote): The fastest choice for non-mutating
+    array-based (BLAS) functions.
+  - [`AutoEnzyme`](@extref ADTypes.AutoEnzyme): Uses `Enzyme.jl` Reverse Mode and works for
+    both in-place and out-of-place functions.
 
 !!! tip
diff --git a/docs/src/tutorials/code_optimization.md b/docs/src/tutorials/code_optimization.md
index ce114961b..b7eb9c174 100644
--- a/docs/src/tutorials/code_optimization.md
+++ b/docs/src/tutorials/code_optimization.md
@@ -90,7 +90,7 @@ end
 
 Allocations are only expensive if they are “heap allocations”. For a more in-depth
 definition of heap allocations,
-[there are many sources online](http://net-informations.com/faq/net/stack-heap.htm).
+[there are many sources online](https://net-informations.com/faq/net/stack-heap.htm).
 But a good working definition is that heap allocations are variable-sized slabs of memory
 which have to be pointed to, and this pointer indirection costs time. Additionally, the
 heap has to be managed, and the garbage controllers has to actively keep track of what's on the
diff --git a/docs/src/tutorials/large_systems.md b/docs/src/tutorials/large_systems.md
index a27105227..17b768288 100644
--- a/docs/src/tutorials/large_systems.md
+++ b/docs/src/tutorials/large_systems.md
@@ -162,10 +162,11 @@ sparse differentiation!
 
 One of the useful companion tools for NonlinearSolve.jl is
 [ADTypes.jl](https://github.com/SciML/ADTypes.jl) that specifies the interface for sparsity
-detection via [`jacobian_sparsity`](@extref ADTypes). This allows for automatic
-declaration of Jacobian sparsity types. To see this in action, we can give an example `du`
-and `u` and call `jacobian_sparsity` on our function with the example arguments, and it will
-kick out a sparse matrix with our pattern, that we can turn into our `jac_prototype`.
+detection via [`jacobian_sparsity`](@extref ADTypes.jacobian_sparsity). This allows for
+automatic declaration of Jacobian sparsity types. To see this in action, we can give an
+example `du` and `u` and call `jacobian_sparsity` on our function with the example
+arguments, and it will kick out a sparse matrix with our pattern, that we can turn into our
+`jac_prototype`.
 
 !!! tip
diff --git a/src/algorithms/extension_algs.jl b/src/algorithms/extension_algs.jl
index a3555a17b..46ebc8dae 100644
--- a/src/algorithms/extension_algs.jl
+++ b/src/algorithms/extension_algs.jl
@@ -317,7 +317,7 @@ NLSolversJL(; method, autodiff = nothing) = NLSolversJL(method, autodiff)
 
     SpeedMappingJL(; σ_min = 0.0, stabilize::Bool = false, check_obj::Bool = false,
         orders::Vector{Int} = [3, 3, 2], time_limit::Real = 1000)
 
-Wrapper over [SpeedMapping.jl](https://nicolasl-s.github.io/SpeedMapping.jl) for solving
+Wrapper over [SpeedMapping.jl](https://nicolasl-s.github.io/SpeedMapping.jl/) for solving
 Fixed Point Problems. We allow using this algorithm to solve root finding problems as well.
 
 ### Keyword Arguments
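
For the backend summary touched in `docs/src/basics/autodiff.md`, the sketch below shows how one of these ADTypes backends is typically passed to a NonlinearSolve.jl solver through the `autodiff` keyword. The 2×2 residual, initial guess, and parameter value are made up for illustration.

```julia
using NonlinearSolve, ADTypes

f(u, p) = u .* u .- p                # toy out-of-place residual (illustrative only)
u0 = [1.0, 1.0]
prob = NonlinearProblem(f, u0, 2.0)

# Dense problem: ForwardDiff is usually the best choice.
sol_forward = solve(prob, NewtonRaphson(; autodiff = AutoForwardDiff()))

# Finite differencing: not optimal, but always applicable.
sol_finite = solve(prob, NewtonRaphson(; autodiff = AutoFiniteDiff()))
```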
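For the heap-allocation discussion in `docs/src/tutorials/code_optimization.md`, here is a minimal sketch contrasting an out-of-place residual, which heap-allocates its output on every call, with an in-place version that reuses a preallocated buffer. The residual is a made-up example and `@allocated` is only a rough measurement.

```julia
# Out-of-place: a fresh result array is heap-allocated on every call.
f_oop(u, p) = u .* u .- p

# In-place: results are written into a preallocated buffer, so the hot loop
# does not allocate.
function f_iip!(du, u, p)
    @. du = u * u - p
    return nothing
end

u = rand(100)
du = similar(u)
p = 2.0

f_oop(u, p); f_iip!(du, u, p)         # warm up so compilation isn't measured
@show @allocated f_oop(u, p)          # nonzero bytes: a new array each call
@show @allocated f_iip!(du, u, p)     # 0 bytes: the buffer is reused
```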
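For the sparsity-detection paragraph in `docs/src/tutorials/large_systems.md`, this is a minimal sketch of the `ADTypes.jacobian_sparsity` interface, assuming `TracerSparsityDetector` from SparseConnectivityTracer.jl as the detector implementation. The tridiagonal toy residual is invented for illustration.

```julia
using ADTypes, SparseConnectivityTracer

# Toy in-place residual with nearest-neighbor coupling (illustrative only):
# each equation touches u[i-1], u[i], u[i+1], so the Jacobian is tridiagonal.
function f!(du, u)
    n = length(u)
    du[1] = 2u[1] - u[2]
    for i in 2:(n - 1)
        du[i] = 2u[i] - u[i - 1] - u[i + 1]
    end
    du[n] = 2u[n] - u[n - 1]
    return nothing
end

du0 = zeros(32)
u0 = rand(32)

# Returns a sparse Boolean matrix with the Jacobian's nonzero pattern, which can
# then be supplied as the `jac_prototype` when building the nonlinear function.
jac_sparsity = ADTypes.jacobian_sparsity(f!, du0, u0, TracerSparsityDetector())
```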
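For the `SpeedMappingJL` docstring in `src/algorithms/extension_algs.jl`, a hedged sketch of how the wrapper might be called, assuming the residual-form convention `g(u) - u = 0` used for posing fixed-point problems as root-finding problems elsewhere in the NonlinearSolve.jl docs. The cosine map is a made-up example, and SpeedMapping.jl must be loaded for the extension algorithm to be available.

```julia
using NonlinearSolve, SpeedMapping   # SpeedMapping.jl enables the extension

# Fixed point of u -> cos.(u), written in residual form g(u) - u = 0.
g_residual(u, p) = cos.(u) .- u
prob = NonlinearProblem(g_residual, fill(0.5, 2))
sol = solve(prob, SpeedMappingJL())
```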