Add getmodel and setmodel from/to LogDensityFunction #626
Conversation
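As context for the discussion below, here is a minimal sketch of what getmodel and setmodel for a LogDensityFunction could look like. It is not the PR's actual implementation: the `model` field name and the use of Accessors.jl are assumptions.

```julia
# Rough sketch only, not the PR's implementation. Assumes DynamicPPL.LogDensityFunction
# is an immutable struct that stores the model in a `model` field, and uses Accessors.jl
# to rebuild the wrapper with the field replaced.
using DynamicPPL, Accessors

getmodel(f::DynamicPPL.LogDensityFunction) = f.model

function setmodel(f::DynamicPPL.LogDensityFunction, model::DynamicPPL.Model)
    return Accessors.@set f.model = model
end
```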
ext/DynamicPPLReverseDiffExt.jl (Outdated)
@@ -23,4 +25,17 @@ function LogDensityProblemsAD.ADgradient(
)
end

function DynamicPPL.setmodel(
    f::LogDensityProblemsAD.ReverseDiffLogDensity{L,Nothing}, model::DynamicPPL.Model
@torfjelde @yebai this will error because ReverseDiffLogDensity can't be found. It is defined in a package extension (https://github.com/tpapp/LogDensityProblemsAD.jl/blob/449e5661bc2667f7bef061e148a6ea5526cbb427/ext/LogDensityProblemsADReverseDiffExt.jl#L25); do you know how to handle it here?
You can't dispatch on types from package extensions. IIRC, that is a limitation.
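For illustration only (not something this PR does): the type can be looked up at runtime via Base.get_extension, but that does not make it available for method dispatch at definition time. The extension module name LogDensityProblemsADReverseDiffExt below is inferred from the linked file.

```julia
# Hypothetical illustration, not part of this PR. The struct is defined in
# LogDensityProblemsAD's own ReverseDiff extension, so from outside it can only
# be retrieved at runtime; it cannot appear in a method signature written in
# DynamicPPL's extension.
using LogDensityProblemsAD, ReverseDiff  # loading ReverseDiff activates the extension

ext = Base.get_extension(LogDensityProblemsAD, :LogDensityProblemsADReverseDiffExt)
if ext !== nothing
    @show ext.ReverseDiffLogDensity  # accessible as a runtime value only
end
```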
Hmm, yeah this is annoying 😕
Don't really see how we can avoid this, to be honest. One alternative is to define our own AD log density model which contains the ADTypes.AbstractADType instead and uses DifferentiationInterface.jl or something. Or we raise an issue in LogDensityProblemsAD.jl to move the structs into the main package instead of the extension, so that this kind of dispatch is possible.
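A very rough sketch of that alternative, purely for illustration; the struct, its field names, and the logdensity helper it relies on are assumptions, not code from this PR or from LogDensityProblemsAD.jl.

```julia
# Sketch of the suggested alternative: a wrapper that stores an ADTypes backend
# directly and differentiates via DifferentiationInterface.jl, so getmodel/setmodel
# never need to dispatch on extension-defined types.
using ADTypes: AbstractADType, AutoReverseDiff
import DifferentiationInterface as DI
using DynamicPPL

struct ADLogDensity{M<:DynamicPPL.Model,V,B<:AbstractADType}
    model::M
    varinfo::V
    adtype::B   # e.g. AutoReverseDiff()
end

# Trivial accessors, since the model is an ordinary field.
getmodel(f::ADLogDensity) = f.model
setmodel(f::ADLogDensity, model::DynamicPPL.Model) = ADLogDensity(model, f.varinfo, f.adtype)

# Assumes a `logdensity(f, x)` method is defined elsewhere for ADLogDensity.
function logdensity_and_gradient(f::ADLogDensity, x::AbstractVector)
    return DI.value_and_gradient(Base.Fix1(logdensity, f), f.adtype, x)
end
```

With a wrapper like this, setmodel is just a field replacement and no extension-defined types appear in any signature.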
It makes sense to move these structs into the main package.
@yebai @torfjelde I don't think we need this PR now. The motivation of this PR was that we want customized …
@sunxd3 let's still transfer these methods from …
LGTM :) Just bump the patch version + remove the unnecessary addition of HypothesisTests.jl to the test project?
@@ -1,6 +1,7 @@
[deps]
Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
DynamicPPL = "366bfd00-2699-11ea-058f-f148b4cae6d8"
HypothesisTests = "09f84164-cd44-5f33-b23f-e6b0d136a0d5"
This seems unintended?
It is -- without this, Turing integration tests fail (ref https://github.com/TuringLang/Turing.jl/blob/29a134245b2499d59fa992420eba37ab2b9f5945/test/test_utils/numerical_tests.jl#L6)
But maybe there's a better solution? 🤔
@torfjelde another look? 🙏
LGTM :)
ref: TuringLang/Turing.jl#2231 (comment)