
Commit 7fd1640

Author: AoifeHughes
Commit message: bump with Mhauru's change suggestions pt2
Parent: 1913cbc

2 files changed: +10 -11 lines

Project.toml

Lines changed: 2 additions & 0 deletions
@@ -21,6 +21,7 @@ DocStringExtensions = "ffbed154-4ef7-542d-bbb7-c09d3a79fcae"
 DynamicPPL = "366bfd00-2699-11ea-058f-f148b4cae6d8"
 EllipticalSliceSampling = "cad2338a-1db2-11e9-3401-43bc07c9ede2"
 ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
+JuliaFormatter = "98e50ef6-434e-11e9-1051-2b60c6c9e899"
 Libtask = "6f1fad26-d15e-5dc8-ae53-837a1d7b8c9f"
 LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
 LogDensityProblems = "6fdf6af0-433a-55f7-b3ed-c6c6e0b8df7c"

@@ -67,6 +68,7 @@ DynamicHMC = "3.4"
 DynamicPPL = "0.37.2"
 EllipticalSliceSampling = "0.5, 1, 2"
 ForwardDiff = "0.10.3, 1"
+JuliaFormatter = "1.0.62"
 Libtask = "0.9.3"
 LinearAlgebra = "1"
 LogDensityProblems = "2"
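
The commit adds JuliaFormatter as a direct dependency with a `1.0.62` compat bound; the diff itself does not show how it is invoked. Purely as a hedged illustration of the library's usual entry point (the `"src"` path is an assumption, not taken from this commit):

```julia
using JuliaFormatter

# Recursively format every .jl file under src/ in place;
# `format` returns `true` when no changes were needed.
already_formatted = format("src")
```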

src/mcmc/hmc.jl

Lines changed: 8 additions & 11 deletions
@@ -115,7 +115,9 @@ This method handles adaptation and warm-up for adaptive Hamiltonian samplers.
 
 - `initial_params`: Initial parameter values for sampling. See `DynamicPPL.initialstep` for details.
 
-Additional keyword arguments are passed to the underlying sampling implementation.
+Additional keyword arguments (e.g., `verbose`, `progress`, `chain_type`) are passed to the underlying
+sampling implementation. For more information on available options, see the
+[sampling options documentation](https://turinglang.org/docs/usage/sampling-options).
 
 # Note
 
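
The rewritten docstring names `verbose`, `progress`, and `chain_type` as examples of pass-through keyword arguments. As a hedged sketch of what that looks like at the call site (the model and argument values below are made up for illustration; only the keyword names come from the docstring):

```julia
using Turing

@model function demo(x)
    μ ~ Normal(0, 1)   # prior on the mean
    x ~ Normal(μ, 1)   # likelihood for a single observation
end

# Keyword arguments after the `;` are forwarded to the underlying sampling
# implementation: `progress` toggles the progress meter and `verbose` the
# informational logging (e.g., the automatically determined step size).
chain = sample(demo(1.5), NUTS(), 1000; progress=false, verbose=false)
```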
@@ -233,17 +235,12 @@ and performs the first sampling step.
 
 # Keyword Arguments
 
-- `initial_params`: Initial parameter values to use for sampling. If `nothing` (the default),
-  parameters are resampled from the prior until valid initial values with finite log probability
-  and gradient are found. If provided, these values are used directly without validation.
-  Must be in the same format as the model's parameters.
+For common keyword arguments like `initial_params` and `verbose`, see the generic
+`DynamicPPL.initialstep` documentation.
 
 - `nadapts::Int`: Number of adaptation steps to be performed. Used internally to set up adaptation.
   Defaults to `0`.
 
-- `verbose::Bool`: Whether to print informative messages (e.g., the automatically determined step size).
-  Defaults to `true`.
-
 # Note
 
 If automatic initial parameter search fails after many attempts, an error is raised with
@@ -408,7 +405,7 @@ setting path lengths in Hamiltonian Monte Carlo." Journal of Machine Learning
 Research 15, no. 1 (2014): 1593-1623.
 """
 struct HMCDA{AD,metricT<:AHMC.AbstractMetric} <: AdaptiveHamiltonian
-    n_adapts::Int # number of samples with adaption for ϵ
+    n_adapts::Int # number of samples with adaptation for ϵ
     δ::Float64 # target accept rate
     λ::Float64 # target leapfrog length
     ϵ::Float64 # (initial) step size
@@ -460,7 +457,7 @@ Usage:
 
 ```julia
 NUTS() # Use default NUTS configuration.
-NUTS(1000, 0.65) # Use 1000 adaption steps, and target accept ratio 0.65.
+NUTS(1000, 0.65) # Use 1000 adaptation steps, and target accept ratio 0.65.
 ```
 
 # Arguments
@@ -474,7 +471,7 @@ NUTS(1000, 0.65) # Use 1000 adaption steps, and target accept ratio 0.65.
   If not specified, `ForwardDiff` is used, with its `chunksize` automatically determined.
 """
 struct NUTS{AD,metricT<:AHMC.AbstractMetric} <: AdaptiveHamiltonian
-    n_adapts::Int # number of samples with adaption for ϵ
+    n_adapts::Int # number of samples with adaptation for ϵ
     δ::Float64 # target accept rate
     max_depth::Int # maximum tree depth
     Δ_max::Float64
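
Both structs keep the number of adaptation steps (`n_adapts`) alongside their tuning targets. As a hedged sketch of how these samplers are typically constructed in Turing.jl (the `NUTS` call is taken from the docstring above; the `HMCDA` arguments are illustrative and assume the standard positional constructor is unchanged in this version):

```julia
using Turing

# NUTS: 1000 adaptation steps and target acceptance ratio 0.65,
# i.e. the `n_adapts` and `δ` fields shown above.
nuts = NUTS(1000, 0.65)

# HMCDA: 200 adaptation steps, target acceptance 0.65, target leapfrog
# path length λ = 0.3; the (initial) step size ϵ is tuned during adaptation.
hmcda = HMCDA(200, 0.65, 0.3)
```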
