Fix for default AutoForwardDiff
#581
Conversation
Should be a fairly simple merge.
LGTM
Okay, so tests are failing because of a suuuper old bug, it seems, but I'm now very confused as to why this was hit now rather than any time before.
@yebai @torfjelde @devmotion turns out the issue is caused by TuringLang/Turing.jl#2170, hit when running these lines:

- DynamicPPL.jl/src/context_implementations.jl, line 282 (at abcf584)
- DynamicPPL.jl/src/context_implementations.jl, line 292 (at abcf584)
Stack trace:

```
MethodError: no method matching dot_tilde_assume(::Random.TaskLocalRNG, ::DynamicPPL.IsLeaf, ::PriorContext{Nothing}, ::SampleFromPrior, ::InverseGamma{Float64}, ::Vector{Float64}, ::Vector{VarName{:s, Setfield.IndexLens{Tuple{Int64}}}}, ::UntypedVarInfo{DynamicPPL.Metadata{Dict{VarName, Int64}, Vector{Distribution}, Vector{VarName}, Vector{Real}, Vector{Set{DynamicPPL.Selector}}}, Float64})

Closest candidates are:
  dot_tilde_assume(!Matched::AbstractPPL.AbstractContext, ::Any...)
    @ DynamicPPL ~/DynamicPPL.jl/src/context_implementations.jl:288
  dot_tilde_assume(::Any, !Matched::AbstractPPL.AbstractContext, ::Any...)
    @ DynamicPPL ~/DynamicPPL.jl/src/context_implementations.jl:291
  dot_tilde_assume(!Matched::DynamicPPL.IsLeaf, ::Any, ::AbstractPPL.AbstractContext, ::Any, ::Any, ::Any, ::Any, ::Any)
    @ DynamicPPL ~/DynamicPPL.jl/src/context_implementations.jl:298
  ...

Stacktrace:
  [1] dot_tilde_assume(::Random.TaskLocalRNG, ::PriorContext{…}, ::SampleFromPrior, ::InverseGamma{…}, ::Vector{…}, ::Vector{…}, ::UntypedVarInfo{…})
    @ DynamicPPL ~/DynamicPPL.jl/src/context_implementations.jl:292
  [2] dot_tilde_assume(context::SamplingContext{…}, right::InverseGamma{…}, left::Vector{…}, vn::Vector{…}, vi::UntypedVarInfo{…})
    @ DynamicPPL ~/DynamicPPL.jl/src/context_implementations.jl:282
  [3] dot_tilde_assume!!(context::SamplingContext{…}, right::InverseGamma{…}, left::Vector{…}, vn::Vector{…}, vi::UntypedVarInfo{…})
    @ DynamicPPL ~/DynamicPPL.jl/src/context_implementations.jl:406
  [4] demo_dot_assume_dot_observe(__model__::Model{…}, __varinfo__::UntypedVarInfo{…}, __context__::SamplingContext{…}, x::Vector{…}, arg#360::Type{…})
    @ DynamicPPL.TestUtils ~/DynamicPPL.jl/src/test_utils.jl:340
  [5] _evaluate!!(model::Model{…}, varinfo::UntypedVarInfo{…}, context::SamplingContext{…})
    @ DynamicPPL ~/DynamicPPL.jl/src/model.jl:963
  [6] evaluate_threadunsafe!!(model::Model{…}, varinfo::UntypedVarInfo{…}, context::SamplingContext{…})
    @ DynamicPPL ~/DynamicPPL.jl/src/model.jl:936
  [7] evaluate!!(model::Model{…}, varinfo::UntypedVarInfo{…}, context::SamplingContext{…})
    @ DynamicPPL ~/DynamicPPL.jl/src/model.jl:889
  [8] step(rng::Random.TaskLocalRNG, model::Model{…}, sampler::Sampler{…}, state::Nothing; kwargs::@Kwargs{})
    @ Turing.Inference ~/.julia/dev/Turing/src/mcmc/Inference.jl:137
  [9] step(rng::Random.TaskLocalRNG, model::Model{…}, sampler::Sampler{…}, state::Nothing)
    @ Turing.Inference ~/.julia/dev/Turing/src/mcmc/Inference.jl:130
 [10] macro expansion
    @ ~/.julia/packages/AbstractMCMC/YrmkI/src/sample.jl:130 [inlined]
 [11] macro expansion
    @ ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:328 [inlined]
 [12] macro expansion
    @ ~/.julia/packages/AbstractMCMC/YrmkI/src/logging.jl:9 [inlined]
 [13] mcmcsample(rng::Random.TaskLocalRNG, model::Model{…}, sampler::Sampler{…}, N::Int64; progress::Bool, progressname::String, callback::Nothing, discard_initial::Int64, thinning::Int64, chain_type::Type, initial_state::Nothing, kwargs::@Kwargs{})
    @ AbstractMCMC ~/.julia/packages/AbstractMCMC/YrmkI/src/sample.jl:120
 [14] sample(rng::Random.TaskLocalRNG, model::Model{…}, sampler::Sampler{…}, N::Int64; chain_type::Type, resume_from::Nothing, initial_state::Nothing, kwargs::@Kwargs{})
    @ DynamicPPL ~/DynamicPPL.jl/src/sampler.jl:93
 [15] sample(rng::Random.TaskLocalRNG, model::Model{…}, sampler::Sampler{…}, N::Int64)
    @ DynamicPPL ~/DynamicPPL.jl/src/sampler.jl:83
 [16] sample(rng::Random.TaskLocalRNG, model::Model{…}, alg::Prior, N::Int64; kwargs::@Kwargs{})
    @ Turing.Inference ~/.julia/dev/Turing/src/mcmc/Inference.jl:219
 [17] sample(rng::Random.TaskLocalRNG, model::Model{…}, alg::Prior, N::Int64)
    @ Turing.Inference ~/.julia/dev/Turing/src/mcmc/Inference.jl:212
 [18] sample(model::Model{…}, alg::Prior, N::Int64; kwargs::@Kwargs{})
    @ Turing.Inference ~/.julia/dev/Turing/src/mcmc/Inference.jl:209
 [19] sample(model::Model{…}, alg::Prior, N::Int64)
    @ Turing.Inference ~/.julia/dev/Turing/src/mcmc/Inference.jl:203
 [20] entry-point
```

I am not familiar enough with the context implementations to instantly see a solution, but maybe you can.
The reason it didn't surface earlier is that the CI tests of #571 were run ahead of the merging of TuringLang/Turing.jl#2170.
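For intuition, the failure above is ordinary Julia dispatch: the call passes an RNG ahead of the leaf-node marker and context, an argument order no method signature accepts. A minimal, generic sketch of this failure mode (names here are hypothetical, not DynamicPPL's actual API):

```julia
struct IsLeaf end
struct Context end

# Methods expect the context (or leaf marker then context) up front:
g(::Context, args...) = :ctx_first
g(::IsLeaf, ::Any, ::Context, args...) = :leaf_first

g(Context(), 1, 2)  # matches the first method

# Passing an RNG-like value ahead of the leaf marker matches no method,
# mirroring the `dot_tilde_assume` MethodError in the stack trace above:
rng = 42
try
    g(rng, IsLeaf(), Context())
catch err
    @assert err isa MethodError
end
```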
Codecov Report: All modified and coverable lines are covered by tests ✅

```
@@            Coverage Diff             @@
##           master     #581      +/-   ##
==========================================
+ Coverage   84.32%   84.47%   +0.14%
==========================================
  Files          26       28       +2
  Lines        3183     3207      +24
==========================================
+ Hits         2684     2709      +25
+ Misses        499      498       -1
==========================================
```

View full report in Codecov by Sentry.
By default, `AutoForwardDiff()` results in `chunksize` being `nothing`, not `0`. This then breaks our `LogDensityProblemsAD.ADgradient` constructor.
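A minimal sketch of the kind of guard this implies, assuming downstream code wants an integer chunk size (the helper name `_getchunksize` is hypothetical, not DynamicPPL's actual function):

```julia
using ADTypes: AutoForwardDiff

# By default the chunk-size type parameter is `nothing`, not `0`:
adtype = AutoForwardDiff()

# Hypothetical helper: normalise `nothing` to `0` ("pick automatically")
# so that a constructor expecting an integer chunk size doesn't break.
_getchunksize(::AutoForwardDiff{chunk}) where {chunk} =
    chunk === nothing ? 0 : chunk

_getchunksize(adtype)  # 0 rather than `nothing`
```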