Misc fixes to contrib.epidemiology #2527
Conversation
@fritzo any hypothesis as to why?

Yes, I believe there is a bias-variance tradeoff where very small …
```diff
 # Try to initialize kernel.transforms using kernel.setup().
 if getattr(self.kernel, "transforms", None) is None:
     warmup_steps = 0
     self.kernel.setup(warmup_steps, *args, **kwargs)
 # Use `kernel.transforms` when available
-if hasattr(self.kernel, 'transforms') and self.kernel.transforms is not None:
+if getattr(self.kernel, "transforms", None) is not None:
```
This is required because manually calling `initialize_model()` below would ignore `init_strategy` and attempt to sample from `ImproperUniform`. Instead we let `kernel.setup()` call `initialize_model()` with proper arguments.
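The `getattr(obj, "attr", None)` idiom used in the diff above handles both "attribute missing" and "attribute set to `None`" in a single expression, which the old `hasattr(...) and obj.attr is not None` form needed two checks for. A minimal sketch (the `Kernel` class here is a hypothetical stand-in, not Pyro's actual kernel):

```python
class Kernel:
    """Minimal stand-in for an MCMC kernel; `transforms` starts unset."""

    transforms = None

    def setup(self, warmup_steps, *args, **kwargs):
        # A real kernel's setup() would call initialize_model() and
        # populate self.transforms; here we just simulate that.
        self.transforms = {"x": "identity"}


kernel = Kernel()
# getattr with a default covers both a missing attribute and a None value,
# so one branch decides whether setup() still needs to run.
if getattr(kernel, "transforms", None) is None:
    kernel.setup(warmup_steps=0)

assert kernel.transforms is not None
```

The same check then works unchanged whether `transforms` was never defined or was explicitly initialized to `None`.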
```diff
@@ -92,16 +93,17 @@ def hook_fn(kernel, *unused):
     warmup_steps=args.warmup_steps,
     num_samples=args.num_samples,
     num_chains=args.num_chains,
+    mp_context="spawn" if parallel else None,
```
Unfortunately I've found this to be somewhat buggy (a fair number of crashes).
Do you use "forkserver" instead?
No, I've used "spawn", but I've tried to avoid it...
Phew 😅 thanks for reviewing!
Addresses #2426
This implements three fixes in preparation for a tutorial notebook:

1. Converts `heuristic()` to a method `._heuristic()`.
2. `._concat_series()`
3. Introduces a `_RELAX_MIN_VARIANCE` variable, and lowers the value from 0.25 to 0.1, which seems to improve inference. (I may change it again in the future, but future changes will be one-line changes rather than 6-line changes across 2 files.)

Tested:

- `num_chains=1`
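The point of fix 3 is that a magic number repeated across files becomes a single module-level constant. A hypothetical sketch of that pattern: the constant name mirrors `_RELAX_MIN_VARIANCE` from the PR, but the clamping function below is illustrative, not Pyro's actual implementation.

```python
# Module-level constant: tuning the relaxation floor is now a one-line change.
_RELAX_MIN_VARIANCE = 0.1  # lowered from 0.25 in this PR


def relaxed_variance(empirical_variance):
    """Clamp an empirical variance from below to keep the relaxation stable.

    Too small a variance makes the relaxed distribution nearly deterministic
    (low bias, high variance in the gradient estimate); the floor trades a
    little bias for stability.
    """
    return max(empirical_variance, _RELAX_MIN_VARIANCE)


print(relaxed_variance(0.02))  # 0.1
print(relaxed_variance(0.5))   # 0.5
```

Call sites only reference `relaxed_variance()`, so experiments like the 0.25 → 0.1 change touch exactly one line.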