Improving AD type stability.

@cpfiffer cpfiffer released this 20 Dec 21:46
· 958 commits to master since this release
bf3494a

This release improves the type stability of automatic differentiation. For more details, see #626. Sampling speed should improve slightly across all samplers, with Gibbs sampling receiving a particularly significant performance boost.

You can now specify a different autodiff backend for each variable. The snippet below uses ForwardDiff to sample the mean parameter (m) and the Flux Tracker-based autodiff to sample the variance parameter (s):

using Turing

# Define a simple Normal model with unknown mean and variance.
@model gdemo(x, y) = begin
    s ~ InverseGamma(2, 3)
    m ~ Normal(0, sqrt(s))
    x ~ Normal(m, sqrt(s))
    y ~ Normal(m, sqrt(s))
    return s, m
end
# Sample with Gibbs, assigning a separate AD backend to each HMC step.
c = sample(gdemo(1.5, 2), Gibbs(1000,
    HMC{Turing.ForwardDiffAD{1}}(2, 0.1, 5, :m),  # forward-mode AD for m
    HMC{Turing.FluxTrackerAD}(2, 0.1, 5, :s)))    # Flux Tracker AD for s