Misc fixes to contrib.epidemiology #2527

Merged: 4 commits merged into dev from sir-misc-fixes on Jun 16, 2020
Conversation

@fritzo (Member) commented Jun 15, 2020

Addresses #2426

This implements three fixes in preparation for a tutorial notebook:

  1. Supports parallel-chain MCMC. This requires pickle support, so I've moved the local function heuristic() to a method ._heuristic() (see the sketch after this list).
  2. Fixes a shape error (and resolves a TODO) in ._concat_series().
  3. Refactors the relaxation logic to refer to a single global _RELAX_MIN_VARIANCE variable, and lowers that value from 0.25 to 0.1, which seems to improve inference. (I may change the value again in the future, but future changes will be one-line edits rather than 6-line changes across 2 files.)
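
For context, a minimal sketch (hypothetical stand-in class, not the actual contrib.epidemiology code) of why item 1 needs the refactor: pickle locates functions by qualified name, so a function defined inside another function cannot be sent to the worker processes that run parallel chains, while a method on an importable class can.

```python
import pickle

class CompartmentalStandIn:
    """Hypothetical stand-in for the real model class."""

    def make_local_heuristic(self):
        def heuristic():  # local function: pickle cannot locate it by name
            return {"auxiliary": 0.0}
        return heuristic

    def _heuristic(self):  # bound method: picklable via class + attribute name
        return {"auxiliary": 0.0}

model = CompartmentalStandIn()
pickle.dumps(model._heuristic)  # succeeds in Python 3
try:
    pickle.dumps(model.make_local_heuristic())
except (pickle.PicklingError, AttributeError) as err:
    print("local function fails to pickle:", err)
```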

Tested

  • unit tests for num_chains=1
  • refactoring covered by existing tests
  • verified shape and numerical changes in a notebook

@martinjankowiak (Collaborator) commented:

> Refactors relaxed logic to refer to a global _RELAX_MIN_VARIANCE variable, and lowers the value from 0.25 to 0.1, which seems to improve inference.

@fritzo any hypothesis as to why?

@fritzo (Member, Author) commented Jun 15, 2020:

> lowers _RELAX_MIN_VARIANCE from 0.25 to 0.1, which seems to improve inference.
>
> @fritzo any hypothesis as to why?

Yes, I believe there is a bias-variance tradeoff: a very small _RELAX_MIN_VARIANCE leads to poor mixing because of infeasibility (or a high rejection rate, which forces a small step size), whereas a large _RELAX_MIN_VARIANCE leads to spurious infections due to leakage at zero. Anecdotally, the value 0.1 seems to balance the tradeoff, but I have not yet done a thorough comparison.
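
To make the tradeoff concrete, here is a minimal sketch of the kind of variance floor being discussed, assuming a moment-matched Normal relaxation of a Binomial count (the function name and the moment matching are illustrative, not Pyro's actual implementation):

```python
import torch

_RELAX_MIN_VARIANCE = 0.1  # this PR lowers the floor from 0.25 to 0.1

def relaxed_moments(total_count, probs):
    """Mean and variance of a Normal relaxation of Binomial(total_count, probs).

    Flooring the variance keeps the relaxed density wide enough for HMC to
    mix, but too high a floor lets probability mass leak past zero, which
    can show up as spurious infections.
    """
    mean = total_count * probs
    variance = total_count * probs * (1 - probs)
    return mean, variance.clamp(min=_RELAX_MIN_VARIANCE)

mean, var = relaxed_moments(torch.tensor(100.0), torch.tensor(0.001))
print(mean, var)  # variance 0.0999 is clamped up to the 0.1 floor
```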

Comment on lines +411 to +416:

+        # Try to initialize kernel.transforms using kernel.setup().
+        if getattr(self.kernel, "transforms", None) is None:
+            warmup_steps = 0
+            self.kernel.setup(warmup_steps, *args, **kwargs)
         # Use `kernel.transforms` when available
-        if hasattr(self.kernel, 'transforms') and self.kernel.transforms is not None:
+        if getattr(self.kernel, "transforms", None) is not None:
@fritzo (Member, Author) commented:

This is required because manually calling initialize_model() below would ignore init_strategy and attempt to sample from ImproperUniform. Instead we let kernel.setup() call initialize_model() with proper arguments.
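
For illustration, a hedged sketch (toy model, not the PR's code) of passing an explicit init_strategy to initialize_model(), so that a site with an ImproperUniform prior, which has no sampler, still receives an initial value:

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.distributions import constraints
from pyro.infer.autoguide import init_to_value
from pyro.infer.mcmc.util import initialize_model

def model():
    # ImproperUniform has no .sample(), so initial values must come from
    # the init_strategy rather than from the prior itself.
    rate = pyro.sample("rate", dist.ImproperUniform(constraints.positive, (), ()))
    pyro.sample("obs", dist.Poisson(rate), obs=torch.tensor(3.0))

init_params, potential_fn, transforms, _ = initialize_model(
    model, init_strategy=init_to_value(values={"rate": torch.tensor(3.0)}))
```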

@@ -92,16 +93,17 @@ def hook_fn(kernel, *unused):
         warmup_steps=args.warmup_steps,
         num_samples=args.num_samples,
         num_chains=args.num_chains,
+        mp_context="spawn" if parallel else None,
@martinjankowiak (Collaborator) commented:

unfortunately i've found this to be somewhat buggy (fair number of crashes)

@fritzo (Member, Author) replied Jun 15, 2020:

do you use "forkserver" instead?

@martinjankowiak (Collaborator) replied:

no i've used spawn. but i've tried to avoid it...
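
For readers following along, a minimal runnable sketch of the knob under discussion (toy model; chain counts and sample sizes are arbitrary): Pyro's MCMC accepts an mp_context string selecting the multiprocessing start method used for parallel chains, and "forkserver" is the alternative floated above (note it is unavailable on Windows).

```python
import torch
import pyro
import pyro.distributions as dist
from pyro.infer import MCMC, NUTS

def model(data):
    loc = pyro.sample("loc", dist.Normal(0.0, 1.0))
    pyro.sample("obs", dist.Normal(loc, 1.0), obs=data)

if __name__ == "__main__":  # guard required: spawned workers re-import this module
    mcmc = MCMC(
        NUTS(model),
        num_samples=200,
        warmup_steps=100,
        num_chains=2,
        mp_context="forkserver",  # alternative start method to "spawn"
    )
    mcmc.run(torch.randn(10))
```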

@fritzo (Member, Author) commented Jun 16, 2020:

Phew 😅 thanks for reviewing!

@martinjankowiak martinjankowiak merged commit 4996a77 into dev Jun 16, 2020
@fritzo fritzo deleted the sir-misc-fixes branch July 14, 2020 01:24