Releases · jfcrenshaw/pzflow
v3.1.3
v3.1.2
Changed `jnp.NINF` to `-jnp.inf`, as the former has been removed from JAX.
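For downstream code that hits the same removal, the migration is a one-liner. A minimal sketch (`jnp.NINF` was an alias for negative infinity that recent JAX releases dropped):

```python
import jax.numpy as jnp

# Old spelling, removed from recent JAX releases:
#   lowest = jnp.NINF
# Portable spelling, valid on all JAX versions:
lowest = -jnp.inf

# e.g. assigning zero probability outside a distribution's support
log_prob = jnp.where(jnp.array([True, False]), 0.0, -jnp.inf)
print(log_prob)  # [  0. -inf]
```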
v3.1.1
- Fixed a typo in the `info` for `CentBeta13` that prevented saving and then loading a flow with this distribution
v3.1.0
- Added validation loss tracking to the training method
- After training, the parameters from the epoch with the best loss are saved, rather than the parameters from the final epoch. If a validation set is provided, this reference loss is the validation loss; otherwise it is the training loss (see the sketch below).
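A minimal sketch of the new workflow, using the bundled two-moons example data; the exact keyword for passing the validation set is an assumption, so check the `Flow.train` signature in your installed version:

```python
from pzflow import Flow
from pzflow.examples import get_twomoons_data

data = get_twomoons_data()                     # pandas DataFrame of x, y
split = int(0.9 * len(data))
train_set, val_set = data[:split], data[split:]

flow = Flow(data_columns=("x", "y"))

# As of v3.1.0 the validation loss is tracked each epoch, and the
# parameters from the best epoch are kept (judged by the validation
# loss here, or by the training loss if no validation set is given).
# The `val_set` keyword name is an assumption.
losses = flow.train(train_set, val_set=val_set, verbose=True)
```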
v3.0.3
- Fixed an overflow bug in the Uniform distribution's log_prob when the number of dimensions is greater than 10 (see the sketch below)
- Updated dependencies
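This is the classic product-of-densities failure mode: multiplying per-dimension densities before taking the log overflows once enough dimensions are involved, while summing per-dimension log densities stays finite. A minimal sketch of the failure mode (not pzflow's internal code):

```python
import jax.numpy as jnp

# 20 narrow uniform dimensions, each with density 1/width.
widths = jnp.full(20, 1e-20)

# Unstable: the product of per-dimension densities overflows to inf
# before the log is ever taken.
unstable = jnp.log(jnp.prod(1.0 / widths))   # inf

# Stable: sum the per-dimension log densities instead.
stable = -jnp.sum(jnp.log(widths))           # ≈ 921.0

print(unstable, stable)
```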
v3.0.2
Added `-> None` return annotations to functions and methods that don't return a value, as sketched below.
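Illustrative only; the class and method shown are hypothetical, not pzflow's actual code:

```python
class Example:
    def save(self, file: str) -> None:
        """Methods with no return value now carry an explicit -> None."""
        ...
```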
v3.0.0 / v3.0.1
Release of PZFlow 3.0.0 (v3.0.1 was just a fix to the README on PyPI and doesn't reflect any changes to the code)
Changes
- Made Uniform the default distribution
- Made all distributions symmetric about zero
- Moved to Optax for optimization
- Added default bijectors to `Flow` and `FlowEnsemble` (see the sketch below)
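With these defaults in place, construction collapses to column names. A minimal sketch; the ensemble-size keyword `N` is an assumption:

```python
from pzflow import Flow, FlowEnsemble

# The bijector and the latent distribution (now a Uniform, symmetric
# about zero) both have defaults, so neither needs to be specified.
flow = Flow(data_columns=("x", "y"))

# The same defaults apply to ensembles; the ensemble-size keyword
# `N` is an assumption here.
ensemble = FlowEnsemble(data_columns=("x", "y"), N=4)
```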
Features
- Added early stopping to train methods
- Added progress bar to train methods
- Added the `CentBeta13` distribution, which looks like a Gaussian with hard cutoffs
Bugs
- Fixed a bug where `FlowEnsemble` metadata was not available from the ensemble itself, only from the individual flows in the ensemble
v2.0.7
- Made some fixes so that pzflow works with the latest version of JAX
- I had to change `UniformDequantizer` so that you always have to specify which columns to dequantize (see the sketch below)
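A minimal sketch of the new requirement; the keyword name `column_idx` and the coupling settings are assumptions, so check the bijector signatures in your installed version:

```python
from pzflow import Flow
from pzflow.bijectors import Chain, RollingSplineCoupling, UniformDequantizer

# The dequantizer must now be told which (integer-indexed) columns
# to dequantize -- here, a discrete third column.
bijector = Chain(
    UniformDequantizer(column_idx=[2]),
    RollingSplineCoupling(nlayers=3),
)
flow = Flow(data_columns=("x", "y", "k"), bijector=bijector)
```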
v2.0.6
Changes from the last few bug fixes:
- 2.0.4: fixed conditional sampling for `FlowEnsemble`s so that it is obvious which sample belongs to which condition
- 2.0.5: fixed conversion of samples to pandas DataFrames, so that the dtype is float rather than object
- 2.0.6: fixed an error in the `CentBeta` distribution where the output shape was accidentally printed whenever calling `log_prob`
v2.0.3
Fixed an error where marginalization yielded NaNs when combined with error convolution.