Euclid requisites -- CAMB & CLASS #222

Merged: 52 commits, Feb 22, 2022
Commits
1186864
classy: sigma8(z) support, and test (for CAMB too)
JesusTorrado Oct 21, 2021
507dadf
camb: Omega_{b,cdm,nu_massive,m}
JesusTorrado Dec 14, 2021
e7647df
PowerSpectrumInterpolator: warning for __call__ method
JesusTorrado Dec 15, 2021
0e41050
PowerSpectrumInterpolator: added extrap_kmin and some error control
JesusTorrado Dec 15, 2021
cab46ba
BoltzmannBase: remove Omega_b and abstracted and documented the other…
JesusTorrado Dec 15, 2021
a852565
PowerSpectrumInterpolator: test robustness
JesusTorrado Dec 15, 2021
00d3995
BoltzmannBase+CAMB+classy: abstracted how z's etc are combined
JesusTorrado Dec 15, 2021
19dcecf
PowerSpectrumInterpolator: more robust bounds check
JesusTorrado Dec 16, 2021
02133f1
CAMB/CLASS: get_z_dependent abstracted and more robust
JesusTorrado Dec 16, 2021
ee1f055
cosmo:get_z_dependent: fixed corner case
JesusTorrado Dec 16, 2021
c18de48
boltzmanncode: adaptive tolerance for choice from 1d list
JesusTorrado Jan 31, 2022
ecbee98
camb: abstracted Pool1D for retrieving values [skip travis]
JesusTorrado Jan 31, 2022
cc78c47
classy: abstracted Pool1D for retrieving values
JesusTorrado Feb 1, 2022
c23bb05
tools: value pools abstracted to N-d
JesusTorrado Feb 1, 2022
5ba8232
boltzmann: fixes for 2D pool and starting with angular_diameter_dista…
JesusTorrado Feb 1, 2022
160b8c3
camb: Omega_{b,cdm,nu_massive,m}
JesusTorrado Dec 14, 2021
2a923cf
PowerSpectrumInterpolator: warning for __call__ method
JesusTorrado Dec 15, 2021
808b470
PowerSpectrumInterpolator: added extrap_kmin and some error control
JesusTorrado Dec 15, 2021
b619bf1
BoltzmannBase: remove Omega_b and abstracted and documented the other…
JesusTorrado Dec 15, 2021
b8595f6
PowerSpectrumInterpolator: test robustness
JesusTorrado Dec 15, 2021
81058a8
BoltzmannBase+CAMB+classy: abstracted how z's etc are combined
JesusTorrado Dec 15, 2021
364d9ed
PowerSpectrumInterpolator: more robust bounds check
JesusTorrado Dec 16, 2021
f329d05
CAMB/CLASS: get_z_dependent abstracted and more robust
JesusTorrado Dec 16, 2021
ab15872
cosmo:get_z_dependent: fixed corner case
JesusTorrado Dec 16, 2021
3e409c3
boltzmanncode: adaptive tolerance for choice from 1d list
JesusTorrado Jan 31, 2022
4a51a30
camb: abstracted Pool1D for retrieving values [skip travis]
JesusTorrado Jan 31, 2022
52c8319
classy: abstracted Pool1D for retrieving values
JesusTorrado Feb 1, 2022
93bff03
tools: value pools abstracted to N-d
JesusTorrado Feb 1, 2022
70bc49c
boltzmann: fixes for 2D pool and starting with angular_diameter_dista…
JesusTorrado Feb 1, 2022
71019ec
Merge branch 'euclid_requisites_camb' of github.com:cobayasampler/cob…
JesusTorrado Feb 1, 2022
0e5a877
Merge branch 'classy_sigma8_z' into euclid_requisites_camb
JesusTorrado Feb 1, 2022
c3607a4
linting
JesusTorrado Feb 1, 2022
b08ee27
camb: angular_diameter_distance_2 working with CAMB:master
JesusTorrado Feb 1, 2022
5befad1
camb: angular_diameter_distance_2 returns 0 for z1>= z2
JesusTorrado Feb 3, 2022
6ac9cf8
added --minimize flag to cobaya-run
JesusTorrado Feb 3, 2022
2c0aff0
class: update to v3.1.1
JesusTorrado Feb 3, 2022
9d861c7
boltzmann: Pk_interpolator doc
JesusTorrado Feb 4, 2022
bc9dd63
trivial typos
cmbant Feb 4, 2022
f9b90f6
fixes
JesusTorrado Feb 4, 2022
38454ac
Pool1D: try fast search first
JesusTorrado Feb 7, 2022
c5f0fb9
PoolXD: fast check for Pool2D and tests for pools
JesusTorrado Feb 8, 2022
9ad5e53
classy: Omega_X and sigma8 (and tests for CAMB too)
JesusTorrado Feb 8, 2022
7812d9f
classy: ang_diam_dist_2, and some code merging
JesusTorrado Feb 8, 2022
ce76b20
classy: fsigma8
JesusTorrado Feb 8, 2022
7865ed8
classy: Weyl Pkz
JesusTorrado Feb 9, 2022
2b242c5
classy: new in 3.0: halofix|hmcode_min_k_max --> nonlinear_min_k_max
JesusTorrado Feb 15, 2022
e7c020e
classy: sigma(R,z) interfaced and test (CAMB too) [skip travis]
JesusTorrado Feb 15, 2022
ade8dfa
boltzmann: better error msg when recovering z-pair-dependent [skip tr…
JesusTorrado Feb 15, 2022
add8476
Merge branch 'master' into euclid_requisites_camb
cmbant Feb 15, 2022
06dd0c4
typo
cmbant Feb 15, 2022
7da498b
CHANGELOG, get_sigma_R return order, and other small stuff
JesusTorrado Feb 22, 2022
e9e0957
Merge branch 'master' into euclid_requisites_camb
JesusTorrado Feb 22, 2022
1 change: 1 addition & 0 deletions .gitignore
@@ -21,6 +21,7 @@ docs/src_examples/*/*.png
# Tests
.pytest_cache
tests/.ipynb_checkpoints
.coverage*

# IDE-specific
.idea
10 changes: 10 additions & 0 deletions CHANGELOG.md
@@ -2,12 +2,22 @@

- Documented uses of `Model` class in general contexts (previously only cosmo)
- `Model` methods to compute log-probabilities and derived parameters now have an `as_dict` keyword (default `False`), for more informative return value.
- Added ``--minimize`` flag to ``cobaya-run`` for quick minimization (replaces sampler, uses previous output).

### Cosmological likelihoods and theory codes

- `Pk_interpolator`: added extrapolation up to `extrap_kmin` and improved robustness

#### CAMB

- Removed problematic `zrei: zre` alias (fixes #199, thanks @pcampeti)
- Added `Omega_b|cdm|nu_massive(z)` and `angular_diameter_distance_2`
- Returned values for `get_sigma_R` changed from `R, z, sigma(z, R)` to `z, R, sigma(z, R)`.

#### CLASS

- Updated to v3.1.2
- Added `Omega_b|cdm|nu_massive(z)`, `angular_diameter_distance_2`, `sigmaR(z)`, `sigma8(z)`, `fsigma8(z)` and Weyl potential power spectrum.

#### BAO

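As a quick orientation for the new requisites listed above, here is a minimal, hypothetical likelihood sketch. It follows Cobaya's usual `get_requirements`/`provider` pattern, but the exact requirement keys (`Omega_b`, `angular_diameter_distance_2`, `sigma_R`, `fsigma8`) and their option names are assumed spellings, not confirmed by this diff; only `get_sigma_R` and its new return order `z, R, sigma(z, R)` are stated in the CHANGELOG.

```python
# Hypothetical sketch: requirement keys/getters other than get_sigma_R are assumed.
import numpy as np
from cobaya.likelihood import Likelihood


class ToyEuclidRequisites(Likelihood):
    """Toy likelihood illustrating the quantities added in this PR."""

    def get_requirements(self):
        zs = np.linspace(0.0, 2.0, 21)
        return {
            "Omega_b": {"z": zs},                                    # Omega_b(z)
            "angular_diameter_distance_2": {"z_pairs": [(0.5, 1.0), (1.0, 1.5)]},
            "sigma_R": {"z": zs, "R": np.linspace(5.0, 20.0, 16)},   # sigma(R, z)
            "fsigma8": {"z": zs},
        }

    def logp(self, **params_values):
        # New return order in this PR: z, R, sigma(z, R)
        z, R, sigma = self.provider.get_sigma_R()
        return 0.0  # placeholder; a real likelihood would compare to data here
```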
2 changes: 1 addition & 1 deletion cobaya/cosmo_input/input_database.py
@@ -399,7 +399,7 @@

# EXPERIMENTS ############################################################################
base_precision: InfoDict = {"camb": {"halofit_version": "mead"},
"classy": {"non linear": "hmcode", "hmcode_min_k_max": 20}}
"classy": {"non linear": "hmcode", "nonlinear_min_k_max": 20}}
cmb_precision = deepcopy(base_precision)
cmb_precision["camb"].update({"bbn_predictor": "PArthENoPE_880.2_standard.dat",
"lens_potential_accuracy": 1})
6 changes: 3 additions & 3 deletions cobaya/input.py
@@ -168,10 +168,10 @@ def get_info_path(folder, prefix, infix=None, kind="updated", ext=Extension.yaml

def get_used_components(*infos, return_infos=False):
"""
Returns all requested components as an dict ``{kind: set([components])}``.
Returns all requested components as a dict ``{kind: set([components])}``.
Priors are not included.

If ``return_infos=True`` (default: ``False``), returns too a dictionary of inputs per
If ``return_infos=True`` (default: ``False``), also returns a dictionary of inputs per
component, updated in the order in which the info arguments are given.

Components which are just renames of others (i.e. defined with `class_name`) return
@@ -640,7 +640,7 @@ def get_class_path(cls) -> str:
def get_file_base_name(cls) -> str:
"""
Gets the string used as the name for .yaml, .bib files, typically the
class name or a un-CamelCased class name
class name or an un-CamelCased class name
"""
return cls.__dict__.get('file_base_name') or cls.__name__

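A small usage sketch of `get_used_components`, following the docstring corrected above (the exact return container — sets per kind — is taken from that docstring, not verified here):

```python
from cobaya.input import get_used_components

info_1 = {"theory": {"camb": None}, "likelihood": {"sn.pantheon": None}}
info_2 = {"likelihood": {"bao.sdss_dr12_consensus_bao": None}}

# Per the docstring above: a dict like {kind: {component, ...}}, priors excluded
components = get_used_components(info_1, info_2)
```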
4 changes: 2 additions & 2 deletions cobaya/likelihoods/base_classes/des.py
@@ -567,10 +567,10 @@ def chi_squared(self, theory, return_theory_vector=False):

def logp(self, **params_values):
PKdelta = self.provider.get_Pk_interpolator(("delta_tot", "delta_tot"),
extrap_kmax=500 * self.acc)
extrap_kmax=3000 * self.acc)
if self.use_Weyl:
PKWeyl = self.provider.get_Pk_interpolator(("Weyl", "Weyl"),
extrap_kmax=500 * self.acc)
extrap_kmax=3000 * self.acc)
else:
PKWeyl = None

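For reference, a hedged sketch of how the interpolator requested above is typically evaluated: `.P(z, k)` is the documented access method (the `__call__` path now emits a warning, per one of the commits in this PR), and the grid values are illustrative only.

```python
import numpy as np


def matter_power(provider, acc=1.0, z=0.5):
    # Same request as in des.py above: extrap_kmax extends the extrapolation
    # range in k well beyond what the Boltzmann code computed directly.
    PKdelta = provider.get_Pk_interpolator(("delta_tot", "delta_tot"),
                                           extrap_kmax=3000 * acc)
    ks = np.logspace(-4, 2, 200)   # illustrative k grid
    return PKdelta.P(z, ks)        # evaluate P(z, k) on the grid
```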
4 changes: 2 additions & 2 deletions cobaya/likelihoods/base_classes/sn.py
@@ -19,13 +19,13 @@
.. note::

- If you use ``sn.pantheon``, please cite:|br|
Scolnic, D. M. et al,
Scolnic, D. M. et al.,
`The Complete Light-curve Sample of Spectroscopically
Confirmed Type Ia Supernovae from Pan-STARRS1 and
Cosmological Constraints from The Combined Pantheon Sample`
`(arXiv:1710.00845) <https://arxiv.org/abs/1710.00845>`_
- If you use ``sn.jla`` or ``sn.jla_lite``, please cite:|br|
Betoule, M. et al,
Betoule, M. et al.,
`Improved cosmological constraints from a joint analysis
of the SDSS-II and SNLS supernova samples`
`(arXiv:1401.4064) <https://arxiv.org/abs/1401.4064>`_
2 changes: 1 addition & 1 deletion cobaya/log.py
@@ -136,7 +136,7 @@ def logger_setup(debug=None, debug_file=None):
"""
Configuring the root logger, for its children to inherit level, format and handlers.

Level: if debug=True, take DEBUG. If numerical, use "logging"'s corresponding level.
Level: if debug=True, take DEBUG. If numerical, use ""logging""'s corresponding level.
Default: INFO
"""
if debug is True or os.getenv('COBAYA_DEBUG'):
2 changes: 1 addition & 1 deletion cobaya/model.py
@@ -586,7 +586,7 @@ def get_valid_point(self, max_tries: int, ignore_fixed_ref: bool = False,
) -> Union[Tuple[np.ndarray, LogPosterior],
Tuple[np.ndarray, dict]]:
"""
Finds a point with finite posterior, sampled from from the reference pdf.
Finds a point with finite posterior, sampled from the reference pdf.

It will fail if no valid point is found after `max_tries`.

2 changes: 1 addition & 1 deletion cobaya/mpi.py
@@ -173,7 +173,7 @@ def allgather(data) -> list:

def zip_gather(list_of_data, root=0) -> Iterable[tuple]:
"""
Takes a list of items and returns a iterable of lists of items from each process
Takes a list of items and returns an iterable of lists of items from each process
e.g. for root node
[(a_1, a_2),(b_1,b_2),...] = zip_gather([a,b,...])
"""
2 changes: 1 addition & 1 deletion cobaya/output.py
@@ -369,7 +369,7 @@ def check_and_dump_info(self, input_info, updated_info, check_compatible=True,
"%s:%s, but you are trying to resume a "
"run that used a newer version: %r.",
new_version, k, c, old_version)
# If resuming, we don't want to to *partial* dumps
# If resuming, we don't want to do *partial* dumps
if ignore_blocks and self.is_resuming():
return
# Work on a copy of the input info, since we are updating the prefix
2 changes: 1 addition & 1 deletion cobaya/parameterization.py
@@ -44,7 +44,7 @@ def is_derived_param(info_param: ParamInput) -> bool:

def expand_info_param(info_param: ParamInput, default_derived=True) -> ParamDict:
"""
Expands the info of a parameter, from the user friendly, shorter format
Expands the info of a parameter, from the user-friendly, shorter format
to a more unambiguous one.
"""
info_param = deepcopy_where_possible(info_param)
10 changes: 5 additions & 5 deletions cobaya/prior.py
@@ -50,7 +50,7 @@
as they are understood for each particular pdf in :class:`scipy.stats`; e.g. for a
``uniform`` pdf, ``loc`` is the lower bound and ``scale`` is the length of the domain,
whereas in a gaussian (``norm``) ``loc`` is the mean and ``scale`` is the standard
deviation).
deviation.
+ Additional specific parameters of the distribution, e.g. ``a`` and ``b`` as the powers
of a Beta pdf.

@@ -112,8 +112,8 @@
custom priors "`external` priors".

Inside the ``prior`` block, list a pair of priors as ``[name]: [function]``, where the
functions must return **log**-priors. This priors will be multiplied by the
one-dimensional ones defined above. Even if you define an prior for some parameters
functions must return **log**-priors. These priors will be multiplied by the
one-dimensional ones defined above. Even if you define a prior for some parameters
in the ``prior`` block, you still have to specify their bounds in the ``params`` block.

A prior function can be specified in two different ways:
@@ -144,7 +144,7 @@

External priors can only be functions **sampled** and **fixed**
and **derived** parameters that are dynamically defined in terms of other inputs.
Derived parameters computed by the theory code cannot be used in in a prior, since
Derived parameters computed by the theory code cannot be used in a prior, since
otherwise the full prior could not be computed **before** the likelihood,
preventing us from avoiding computing the likelihood when the prior is null, or
forcing a *post-call* to the prior.
@@ -183,7 +183,7 @@
Defining parameters dynamically
-------------------------------

We may want to sample in a parameter space different than the one understood by the
We may want to sample in a parameter space different from the one understood by the
likelihood, e.g. because we expect the posterior to be simpler on the alternative
parameters.

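As a concrete illustration of the external-prior mechanism described in the docstring above, here is a minimal sketch passing a Python callable (one of the two ways a prior function can be specified); `one` is Cobaya's unit likelihood, and the specific prior shape is just an example.

```python
import numpy as np
from cobaya.run import run


def ring_logprior(x, y):
    # Must return the *log*-prior; it multiplies the 1D priors set in "params".
    return -0.5 * ((np.hypot(x, y) - 2.0) / 0.2) ** 2


info = {
    "params": {
        "x": {"prior": {"min": -5, "max": 5}},
        "y": {"prior": {"min": -5, "max": 5}},
    },
    "prior": {"ring": ring_logprior},   # external prior, on top of the 1D ones
    "likelihood": {"one": None},        # Cobaya's unit likelihood
    "sampler": {"mcmc": None},
}
# updated_info, sampler = run(info)
```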
15 changes: 14 additions & 1 deletion cobaya/run.py
@@ -36,6 +36,7 @@ def run(info_or_yaml_or_file: Union[InputDict, str, os.PathLike],
debug: Union[bool, int, None] = None,
stop_at_error: Optional[bool] = None,
resume: bool = False, force: bool = False,
minimize: Optional[bool] = None,
no_mpi: bool = False, test: bool = False,
override: Optional[InputDict] = None,
) -> Union[InfoSamplerTuple, PostTuple]:
@@ -50,6 +51,7 @@ def run(info_or_yaml_or_file: Union[InputDict, str, os.PathLike],
:param stop_at_error: stop if an error is raised
:param resume: continue an existing run
:param force: overwrite existing output if it exists
:param minimize: if true, ignores the sampler and runs default minimizer
:param no_mpi: run without MPI
:param test: only test initialization rather than actually running
:param override: option dictionary to merge into the input one, overriding settings
@@ -59,7 +61,7 @@ def run(info_or_yaml_or_file: Union[InputDict, str, os.PathLike],
"""

# This function reproduces the model-->output-->sampler pipeline one would follow
# when instantiating by hand, but alters the order to performs checks and dump info
# when instantiating by hand, but alters the order to perform checks and dump info
# as early as possible, e.g. to check if resuming possible or `force` needed.
if no_mpi or test:
mpi.set_mpi_disabled()
@@ -77,6 +79,9 @@ def run(info_or_yaml_or_file: Union[InputDict, str, os.PathLike],
info["resume"] = bool(resume)
info["force"] = bool(force)
if info.get("post"):
if minimize:
raise ValueError(
"``minimize`` option is incompatible with post-processing.")
if isinstance(output, str) or output is False:
info["post"]["output"] = output or None
return post(info)
@@ -94,6 +99,11 @@ def run(info_or_yaml_or_file: Union[InputDict, str, os.PathLike],
# GetDist needs to know the original sampler, so don't overwrite if minimizer
try:
which_sampler = list(info["sampler"])[0]
if minimize:
# Preserve options if "minimize" was already the sampler
if which_sampler.lower() != "minimize":
info["sampler"] = {"minimize": None}
which_sampler = "minimize"
except (KeyError, TypeError):
raise LoggedError(
logger_run, "You need to specify a sampler using the 'sampler' key "
@@ -192,6 +202,9 @@ def run_script(args=None):
"(use with care!)")
parser.add_argument("--%s" % "test", action="store_true",
help="Initialize model and sampler, and exit.")
parser.add_argument("--minimize", action="store_true",
help=("Replaces the sampler in the input and runs a minimization "
"process (incompatible with post-processing)."))
parser.add_argument("--version", action="version", version=get_version())
parser.add_argument("--no-mpi", action='store_true',
help="disable MPI when mpi4py installed but MPI does "
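The new `minimize` keyword and `--minimize` flag added above can be exercised in either of the two ways sketched below (the file name is a placeholder):

```python
from cobaya.run import run

# From Python: reuse an existing input and replace its sampler by the minimizer
# updated_info, minimizer = run("chains/original_input.yaml", minimize=True)

# Equivalent shell invocation, using the flag added to cobaya-run above:
#   cobaya-run chains/original_input.yaml --minimize
```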
2 changes: 1 addition & 1 deletion cobaya/samplers/mcmc/proposal.py
@@ -255,7 +255,7 @@ def set_covariance(self, propose_matrix):
"""
Take covariance of sampled parameters (propose_matrix), and construct orthonormal
parameters where orthonormal parameters are grouped in blocks by speed, so changes
in slowest block changes slow and fast parameters, but changes in the fastest
in the slowest block changes slow and fast parameters, but changes in the fastest
block only changes fast parameters

:param propose_matrix: covariance matrix for the sampled parameters.
50 changes: 31 additions & 19 deletions cobaya/samplers/minimize/minimize.py
@@ -6,12 +6,10 @@

This is a **maximizer** for posteriors or likelihoods, based on
`scipy.optimize.Minimize <https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html>`_
and `Py-BOBYQA <https://numericalalgorithmsgroup.github.io/pybobyqa/build/html/index.html>`_
(added in 2.0).
and `Py-BOBYQA <https://numericalalgorithmsgroup.github.io/pybobyqa/build/html/index.html>`_.

.. note::

BOBYQA tends to work better on Cosmological problems with the default settings.
The default is BOBYQA, which tends to work better on Cosmological problems with default
settings.

.. |br| raw:: html

@@ -36,24 +34,30 @@
**If you use scipy**, you can find `the appropriate references here
<https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.minimize.html>`_.

It works more effectively when run on top of a Monte Carlo sample: just change the sampler
for ``Minimize`` with the desired options, and it will use as a starting point the
*maximum a posteriori* (MAP) or best fit (maximum likelihood, or minimal :math:`\chi^2`)
found so far, as well as the covariance matrix of the sample for rescaling of the
parameter jumps.
It works more effectively when run on top of a Monte Carlo sample: it will use the maximum
a posteriori as a starting point (or the best fit, depending on whether the prior is
ignored, :ref:`see below <minimize_like>`), and the recovered covariance matrix of the
posterior to rescale the variables.

To take advantage of a previous run with a Monte Carlo sampler, either:

- change the ``sampler`` to ``minimize`` in the input file,

- or, if running from the shell, repeat the ``cobaya-run`` command used for the original
run, adding the ``--minimize`` flag.

As text output, it produces two different files:
When called from a Python script, Cobaya's ``run`` function returns the updated info
and the products described below in the method
:func:`samplers.minimize.Minimize.products` (see below).

If text output is requested, it produces two different files:

- ``[output prefix].minimum.txt``, in
:ref:`the same format as Cobaya samples <output_format>`,
but containing a single line.

- ``[output prefix].minimum``, the equivalent **GetDist-formatted** file.

If ``ignore_prior: True``, those files are named ``.bestfit[.txt]`` instead of ``minimum``,
and contain the best-fit (maximum of the likelihood) instead of the MAP
(maximum of the posterior).

.. warning::

For historical reasons, in the first two lines of the GetDist-formatted output file
@@ -62,10 +66,6 @@
:math:`-2` times the sum of the individual :math:`\chi^2` (``chi2__``, with double
underscore) in the table that follows these first lines.

When called from a Python script, Cobaya's ``run`` function returns the updated info
and the products described below in the method
:func:`products <samplers.Minimize.Minimize.products>`.

It is recommended to run a couple of parallel MPI processes:
it will finally pick the best among the results.

@@ -77,6 +77,18 @@
want to refine the convergence parameters (see ``override`` options in the ``yaml``
below).


.. _minimize_like:

Maximizing the likelihood instead of the posterior
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

To maximize the likelihood, add ``ignore_prior: True`` in the ``minimize`` input block.

When producing text output, the generated files are named ``.bestfit[.txt]`` instead of
``minimum``, and contain the best-fit (maximum of the likelihood) instead of the MAP
(maximum of the posterior).

"""

# Global
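A minimal input sketch matching the new "Maximizing the likelihood" section above (the likelihood and parameter are toy placeholders):

```python
from cobaya.run import run

info = {
    "likelihood": {"one": None},                      # toy: Cobaya's unit likelihood
    "params": {"x": {"prior": {"min": -1, "max": 1}}},
    "sampler": {"minimize": {"ignore_prior": True}},  # maximize the likelihood, not the posterior
}
# updated_info, minimizer = run(info)
# The maximum found is returned by minimizer.products() (see the method referenced above).
```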
2 changes: 2 additions & 0 deletions cobaya/samplers/polychord/polychord.py
@@ -52,6 +52,8 @@ class polychord(Sampler):
oversample_power: float
nlive: NumberWithUnits
path: str
logzero: float
max_ndead: int

def initialize(self):
"""Imports the PolyChord sampler and prepares its arguments."""
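For completeness, a hedged sketch of an input block setting the two newly declared attributes; the values are illustrative, not PolyChord defaults.

```python
# Values below are illustrative only, not defaults.
info = {
    "sampler": {
        "polychord": {
            "nlive": "25d",      # NumberWithUnits: 25 x number of sampled dimensions
            "logzero": -1e30,    # newly declared attribute above
            "max_ndead": 10000,  # newly declared attribute above
        }
    }
}
```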