Add MeasureTheory support #46
base: main
Conversation
Codecov Report
@@            Coverage Diff             @@
##             main      #46      +/-   ##
==========================================
- Coverage   92.89%   86.91%   -5.99%
==========================================
  Files          13       14       +1
  Lines         521      558      +37
==========================================
+ Hits          484      485       +1
- Misses         37       73      +36
Continue to review full report at Codecov.
Very nice!! I'll find a couple of examples to try out. @mschauer this might also be useful for you, if your PDMP sampler needs a warm-up.
It gets better. Because DynamicHMC's API supports all initialization approaches we might use Pathfinder for, and because SampleChainsDynamicHMC exposes those configuration options untouched, we will be able to do the following, which bypasses the Stan-style warm-up entirely and uses Pathfinder's estimate of the metric:

using Soss, SampleChains, SampleChainsDynamicHMC

model = ... # define a Soss.ConditionalModel
result_pf = pathfinder(model)
config = dynamichmc(
    initialization=(;
        q=result_pf[2][:, 1], κ=GaussianKineticEnergy(result_pf[1].Σ)
    ),
    warmup_stages=default_warmup_stages(; middle_steps=0, doubling_stages=0),
    reporter=ProgressMeterReport(),
)
post = sample(model, config, 1_000)
This is so cool. Looks like I was missing something:

julia> using MeasureTheory
julia> using Pathfinder
julia> pathfinder(Normal() ⊗ Gamma(2,3))
[ Info: Optimized for 7 iterations. Maximum ELBO of -0.02 reached at iteration 1.
(Distributions.MvNormal{Float64, Pathfinder.WoodburyPDMat{Float64, LinearAlgebra.Diagonal{Float64, Vector{Float64}}, Matrix{Float64}, Matrix{Float64}, LinearAlgebra.Diagonal{Float64, Vector{Float64}}, LinearAlgebra.QRCompactWYQ{Float64, Matrix{Float64}, Matrix{Float64}}, LinearAlgebra.UpperTriangular{Float64, Matrix{Float64}}}, Vector{Float64}}(
dim: 2
μ: [0.037802, 1.71518]
Σ: [0.876813 -0.0536098; -0.0536098 0.536895]
)
, [-0.956011 -0.853106 … 0.10737 0.0554134; 1.2142 1.18968 … 1.85447 0.846168], [-2.317, -2.2259, -1.54494, -1.48017, -2.16414])
julia> pathfinder(For(rand(3)) do x InverseGaussian(x,5) end)
[ Info: Optimized for 9 iterations. Maximum ELBO of 0.12 reached at iteration 3.
(Distributions.MvNormal{Float64, Pathfinder.WoodburyPDMat{Float64, LinearAlgebra.Diagonal{Float64, Vector{Float64}}, Matrix{Float64}, Matrix{Float64}, LinearAlgebra.Diagonal{Float64, Vector{Float64}}, LinearAlgebra.QRCompactWYQ{Float64, Matrix{Float64}, Matrix{Float64}}, LinearAlgebra.UpperTriangular{Float64, Matrix{Float64}}}, Vector{Float64}}(
dim: 3
μ: [-1.85876, -1.2139, -0.466543]
Σ: [0.0316455 -0.00529572 -0.00216225; -0.00529572 0.0465599 -0.00650248; -0.00216225 -0.00650248 0.121807]
)
, [-2.02251 -1.48321 … -1.56437 -1.93967; -1.03717 -1.19963 … -1.41102 -1.77201; -0.843345 -0.597198 … -0.844978 -0.741263], [0.348766, -0.759287, 1.31935, -0.638448, -2.66604])

It will be really great to get it working with Soss and Tilde. Thanks for putting this together!
You're welcome! Currently it returns the variational approximation and draws only in the unconstrained space. This is more useful for initializing MCMC, since the initial point and metric can be passed directly to the sampler.
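For reference, a minimal sketch of that handoff using DynamicHMC directly; ℓ and result here are assumptions (ℓ stands for a LogDensityProblems-compatible log-density of the same model, and result for a single-path Pathfinder result tuple like the one shown above):

using DynamicHMC, Random

fit_dist, draws, _ = result  # (fit distribution, unconstrained draws, per-draw values), as in the output above
samples = mcmc_with_warmup(
    Random.default_rng(), ℓ, 1_000;
    # start at a Pathfinder draw and reuse its covariance estimate as the metric
    initialization=(; q=draws[:, 1], κ=GaussianKineticEnergy(fit_dist.Σ)),
    # skip the Stan-style windowed metric adaptation entirely
    warmup_stages=default_warmup_stages(; middle_steps=0, doubling_stages=0),
)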
@cscherrer it seems rand doesn't yet work on the transformed fit distribution, though the transformed draws themselves look fine:

julia> μ = Dirichlet(10, 1.0)
Dirichlet(α = Fill(1.0, 10),)
julia> result = multipathfinder(μ, 1_000; nruns=20)
┌ Warning: Pareto shape k = 0.8 > 0.7. Resulting importance sampling estimates are likely to be unstable.
└ @ PSIS ~/.julia/packages/PSIS/Rf0S3/src/core.jl:268
Multi-path Pathfinder result
runs: 20
draws: 9
Pareto shape diagnostic: 0.8 (bad)
julia> result.fit_distribution_transformed |> typeof
Pushforward{TransformVariables.CallableTransform{TransformVariables.UnitSimplex}, SuperpositionMeasure{Vector{MvNormal{(:μ, :σ), Tuple{Vector{Float64}, Pathfinder.WoodburyPDMat{Float64, LinearAlgebra.Diagonal{Float64, Vector{Float64}}, Matrix{Float64}, Matrix{Float64}, LinearAlgebra.Diagonal{Float64, Vector{Float64}}, LinearAlgebra.QRCompactWYQ{Float64, Matrix{Float64}}, LinearAlgebra.UpperTriangular{Float64, Matrix{Float64}}}}}}}, Static.True}
julia> result.fit_distribution_transformed |> rand
ERROR: MethodError: no method matching rand(::Random._GLOBAL_RNG, ::Type{Float64}, ::SuperpositionMeasure{Vector{MvNormal{(:μ, :σ), Tuple{Vector{Float64}, Pathfinder.WoodburyPDMat{Float64, LinearAlgebra.Diagonal{Float64, Vector{Float64}}, Matrix{Float64}, Matrix{Float64}, LinearAlgebra.Diagonal{Float64, Vector{Float64}}, LinearAlgebra.QRCompactWYQ{Float64, Matrix{Float64}}, LinearAlgebra.UpperTriangular{Float64, Matrix{Float64}}}}}}})
...
julia> result.draws_transformed
1000-element Vector{Vector{Float64}}:
[0.12646688194244224, 0.025977504820487007, 0.012354054767482447, 0.1334736170622469, 0.03246415592570656, 0.11141269659298245, 0.024034133186899103, 0.24475990778479786, 0.1342558624335393, 0.15480118548341593]
[0.0169086648766402, 0.09911231516043292, 0.34416889221132596, 0.10180645019569193, 0.14439126397502686, 0.11713472197731346, 0.08444424836455564, 0.07675136145856229, 0.004312850832038056, 0.010969230948412829]
[0.204562486395576, 0.10418880831568512, 0.14665258035543918, 0.03270907277182203, 0.025198320818369802, 0.08760170721317947, 0.22661983988004777, 0.0454094719713609, 0.08236705364451188, 0.04469065863400787]
[0.01816521614098672, 0.08744121201467416, 0.08194354696600549, 0.21065542194335074, 0.05446274719421834, 0.2462246106035293, 0.08504411121447199, 0.09401263968332611, 0.05085735912825384, 0.07119313511118336]
⋮
[0.05276387864922493, 0.06859177668105768, 0.12000714757911174, 0.013786343525336437, 0.0936220599997711, 0.014944597056716297, 0.15220938553755983, 0.32035991301423006, 0.07951434052610347, 0.08420055743088842]
[0.037353097348125405, 0.1003300343889233, 0.014071365428468915, 0.19619179988878918, 0.002071999284771993, 0.15604889644757544, 0.394789935358081, 0.017367043688102463, 0.02483761912275614, 0.05693820904440616]
[0.092177514123393, 0.03885924399407327, 0.11357006993749118, 0.040281419867287165, 0.0017297788961319048, 0.30301174382535745, 0.08705530131888517, 0.10436044162668617, 0.12720672435099528, 0.09174776205969937]
[0.1036299013468673, 0.0237354567117644, 0.00808528404000608, 0.08094192248237413, 0.012319402714347172, 0.17985596931554573, 0.044975357793939554, 0.09246673694172126, 0.33590939793731694, 0.11808057071611726]
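As a quick sanity check (a sketch, not part of the session above), each transformed draw should be a valid point on the 10-dimensional probability simplex:

# every transformed draw should be nonnegative and sum to 1 (up to floating-point error)
all(x -> all(xi -> xi >= 0, x) && isapprox(sum(x), 1; atol=1e-8), result.draws_transformed)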
Pausing development of this until Soss works with HMC again, see cscherrer/Soss.jl#337.
This PR adds support for calling Pathfinder on a MeasureTheory.AbstractMeasure. This should indirectly add support for Soss.jl and Tilde.jl, which both express conditional models as subtypes of AbstractMeasure.

However, this code doesn't currently work for Soss, which uses an outdated version of MeasureTheory. In particular, Soss:
- calls MeasureTheory.logdensity, which is not defined in more recent versions of MeasureTheory.
- uses xform to get transformations, instead of TransformVariables.as, which MeasureTheory now uses.

cc @cscherrer

Example:
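A sketch of the intended usage, mirroring the REPL session in the comments above (the destructuring follows the three-element tuple shown there; the variable names are only illustrative):

using MeasureTheory, Pathfinder

# product measure with a positivity-constrained second component
fit_dist, draws, _ = pathfinder(Normal() ⊗ Gamma(2, 3))

# a For-comprehension measure over three InverseGaussian components
result = pathfinder(For(rand(3)) do x InverseGaussian(x, 5) end)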