
Use dependencies in the AutoReparam strategy #2966

Draft
fritzo wants to merge 1 commit into base: dev

Conversation

fritzo (Member) commented on Nov 10, 2021

This attempts to apply LocScaleReparam in AutoReparam only to latent variables whose prior distributions depend on upstream variables. If a prior has fixed parameters, there is no need to reparametrize it, and reparametrizing introduces unnecessary complexity and obfuscation.

This first attempt is blocked by nuances in the interaction between get_dependencies() and AutoReparam(): the model's latent variables are determined only dynamically, yet the strategy wants to use dependencies to decide which sites to reparametrize. The fix may involve subtler use of ProvenanceTensor 🤔
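
For illustration, here is a minimal sketch of the intended behavior under the (strong) assumption that the model is static enough for get_dependencies() to enumerate all sample sites up front, which is exactly the assumption that breaks in the dynamic case described above. The helper `dependency_aware_config` and the toy model are hypothetical, not code from this PR:

```python
import torch
import pyro
import pyro.distributions as dist
from pyro import poutine
from pyro.infer.inspect import get_dependencies
from pyro.infer.reparam import LocScaleReparam


def model():
    # "loc" has a fixed-parameter prior, so there is nothing to decenter.
    loc = pyro.sample("loc", dist.Normal(0.0, 1.0))
    # The prior of "x" depends on the upstream latent "loc", so it is a
    # candidate for LocScaleReparam.
    x = pyro.sample("x", dist.Normal(loc, 1.0))
    pyro.sample("obs", dist.Normal(x, 0.1), obs=torch.tensor(1.0))


def dependency_aware_config(model, *args, **kwargs):
    """Build a per-site reparam config, decentering only latent sites whose
    prior depends on some other (upstream) sample site."""
    deps = get_dependencies(model, args, kwargs)["prior_dependencies"]
    trace = poutine.trace(model).get_trace(*args, **kwargs)
    config = {}
    for name, site in trace.nodes.items():
        if site["type"] != "sample" or site["is_observed"]:
            continue  # only latent sample sites are reparametrized
        # Every site depends on itself; any other upstream key means the
        # prior's parameters are not fixed constants.
        if set(deps.get(name, {})) - {name}:
            config[name] = LocScaleReparam()
    return config


# Here only "x" is decentered; "loc" keeps its original parametrization.
reparam_model = poutine.reparam(model, config=dependency_aware_config(model))
```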

fritzo added this to the 1.8 release milestone on Nov 10, 2021
Comment on lines -203 to -206
# TODO reparametrize only if parameters are variable. We might guess
# based on whether parameters are differentiable, .requires_grad. See
# https://github.com/pyro-ppl/pyro/pull/2824

fritzo (Member Author) commented:

This PR resolves this TODO.
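
For comparison, here is a minimal sketch of the heuristic the removed TODO suggested, guessing from `.requires_grad` rather than from dependencies. The name `requires_grad_reparam` is hypothetical, the sketch assumes loc-scale priors, and it is not the implementation in this PR:

```python
import torch
from pyro import poutine
from pyro.infer.reparam import LocScaleReparam


def requires_grad_reparam(msg):
    """Per-site callable for poutine.reparam(config=...): decenter a site only
    when some tensor parameter of its distribution carries a gradient, i.e. is
    not a fixed constant. A fuller version would also unwrap Independent or
    masked distributions and skip observed sites."""
    fn = msg["fn"]
    params = (getattr(fn, name, None) for name in getattr(fn, "arg_constraints", {}))
    if any(isinstance(p, torch.Tensor) and p.requires_grad for p in params):
        return LocScaleReparam()
    return None  # leave the site unreparametrized


# Usage: reparam_model = poutine.reparam(model, config=requires_grad_reparam)
```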

fritzo added the Blocked label on Nov 10, 2021
fritzo modified the milestones: 1.8 release, 1.9 release on Dec 13, 2021
fritzo modified the milestones: 1.9 release, 1.10 on Mar 18, 2022
fritzo modified the milestones: 1.9 release, 1.10 release on Feb 7, 2024