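Example highlighting the problem:

Simple 1-dimensional example using the `one` likelihood (the `one` likelihood is chosen for simplicity of the example; the problem exists generally, though). The original input snippet did not survive here, so below is a minimal sketch of the kind of setup meant: a single parameter with a non-uniform (Gaussian) prior, the `one` likelihood, and the `polychord` sampler. The parameter name, prior settings, and output path are illustrative assumptions, not the original values.

```python
# Minimal sketch (reconstructed, not the original snippet): run the `one`
# likelihood with a non-uniform 1d prior through PolyChord via Cobaya.
from cobaya.run import run

info = {
    "likelihood": {"one": None},  # likelihood that is identically L = 1
    "params": {
        # Non-uniform (Gaussian) prior on a single parameter -- assumed values.
        # (PolyChord needs bounds; Cobaya truncates unbounded priors internally.)
        "x": {"prior": {"dist": "norm", "loc": 0, "scale": 1}},
    },
    "sampler": {"polychord": None},
    "output": "chains/one_test",
}

updated_info, sampler = run(info)
```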
In the above example, we will get a log evidence of logZ=0, as expected for the `one` likelihood.
We would also expect a zero Kullback-Leibler divergence D_KL=0 for the `one` likelihood; however, using the raw PolyChord data (e.g. with anesthetic's `NestedSamples.ns_output()`), we incorrectly get something non-zero.
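For reference, the anesthetic call meant here looks something like the following (a sketch against anesthetic's older `NestedSamples` interface; the chain root path is a placeholder for wherever PolyChord wrote its raw output):

```python
# Hedged sketch of the anesthetic call referred to above; the root path
# is a placeholder, not the actual location Cobaya/PolyChord used.
from anesthetic import NestedSamples

samples = NestedSamples(root="path/to/polychord_raw/one_test")
stats = samples.ns_output()   # samples of logZ and the KL divergence D
print(stats["logZ"].mean())   # ~0, as expected
print(stats["D"].mean())      # expected 0, but comes out non-zero
```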
Suspected reason:
I think this happens because the prior density is not fully renormalised to [0,1] as @JesusTorrado mentions in his answer to #77:
> Indeed, instead of renormalising the prior density to [0,1] using pc_prior and then passing the pure likelihood, I am using pc_prior to just renormalise with respect to the bounds, and passing the density-part of the priors together with the likelihood. This way, we can allow for 1d priors for which we don't have an automatic renormalisation to [0,1].
This effectively turns the non-uniform prior into a contribution to the likelihood. This is fine for the evidence Z, where likelihood L and prior p enter as a product (as long as they enter as a product, you can shovel density from the prior into the likelihood or vice versa, as you like):
$$\log Z = \operatorname{logsumexp}(\log L + \log p)$$
The Kullback-Leibler divergence, on the other hand, has likelihood and prior entering as a product only in its first part; in the second part the likelihood enters on its own (assuming logL and logp are known, and logZ is taken from above):
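The equation that followed here did not survive; given the definitions above, it is presumably the standard form (a reconstruction, not the author's original rendering):

$$
D_\mathrm{KL} \;=\; \int \frac{L\,p}{Z}\,\bigl(\log L - \log Z\bigr)\,\mathrm{d}\theta ,
$$

where the posterior weight L·p/Z is the first part (likelihood and prior as a product) and log L is the second part (the likelihood on its own). A toy Monte Carlo check of the consequence, purely as an illustration of the point above (not from the issue): folding the prior density into the likelihood leaves logZ unchanged but shifts the KL estimate by the divergence between the true prior and the uniform-on-bounds prior.

```python
# Toy check (illustration only): folding a non-uniform prior density into
# the likelihood preserves log Z but corrupts the KL divergence.
import numpy as np
from scipy.special import logsumexp
from scipy.stats import norm

a, b = -10.0, 10.0                                # assumed prior bounds
theta = np.random.default_rng(0).uniform(a, b, 200_000)
vol = b - a

logp = norm.logpdf(theta)                         # normalised Gaussian prior
logL = np.zeros_like(theta)                       # `one` likelihood: log L = 0

# Correct treatment: prior kept as prior.
logZ = logsumexp(logL + logp) - np.log(theta.size) + np.log(vol)
logP = logL + logp - logsumexp(logL + logp)       # posterior weights ~ L * p
D = np.sum(np.exp(logP) * (logL - logZ))          # ~ 0, as expected

# "Shovelled" treatment: density folded into L, uniform prior q = 1/vol.
logL2 = logL + logp + np.log(vol)                 # L' = L * p / q
logZ2 = logsumexp(logL2) - np.log(theta.size)     # still ~ 0 ...
logP2 = logL2 - logsumexp(logL2)                  # posterior weights ~ L'
D2 = np.sum(np.exp(logP2) * (logL2 - logZ2))      # ... but ~ 1.58, not 0

print(logZ, D)    # ~ (0.0, 0.0)
print(logZ2, D2)  # ~ (0.0, 1.58) -- the spurious KL(p || uniform)
```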
Potential improvement:

Could we maybe make use of the `.ppf` function (the inverse of `.cdf`) that comes with the 1d scipy.stats distributions to do the conversion from the unit hypercube to the physical parameters?
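For a 1d scipy.stats distribution this transformation is a one-liner; a quick illustration (the Gaussian is just an example):

```python
# ppf maps uniform unit-hypercube draws to draws from the physical prior.
import numpy as np
from scipy import stats

prior_1d = stats.norm(loc=0, scale=1)          # example 1d prior
u = np.random.default_rng(1).uniform(size=5)   # points in the unit hypercube
x = prior_1d.ppf(u)                            # physical-parameter values
```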
I'm thinking along the lines of
maybe replacing line 183 in cobaya/samplers/polychord/polychord.py:
```diff
- self.pc_prior = lambda x: (locs + np.array(x)[self.ordering] * scales).tolist()
+ self.pc_prior = lambda x: [prior.ppf[i](xi) for i, xi in enumerate(np.array(x)[self.ordering])]
```

(here `prior.ppf[i]` is pseudocode shorthand for the ppf of the `i`-th 1d prior)
and passing the pure likelihood instead of logposterior + self.logvolume to run_polychord in lines 236-260.
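For what it's worth, on the PolyChord side this would look roughly like the following: a sketch against pypolychord's Python interface, where how the per-parameter ppfs would actually be collected from Cobaya's prior object is exactly the open question.

```python
# Sketch (assumption: `ppfs` would somehow be gathered from Cobaya's 1d priors).
import numpy as np
from scipy import stats

ppfs = [stats.norm(0, 1).ppf]   # one ppf per sampled parameter (assumed)

def prior(hypercube):
    """Unit hypercube -> physical parameters via the inverse CDFs."""
    return np.array([ppf(u) for ppf, u in zip(ppfs, hypercube)])

def loglikelihood(theta):
    """Pure likelihood, with no prior-density term folded in."""
    return 0.0, []               # the `one` likelihood: log L = 0

# pypolychord.run_polychord(loglikelihood, nDims=1, nDerived=0,
#                           settings=settings, prior=prior)
```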
I am unsure, though, how easily accessible the 1d prior distributions are in polychord.py.
I should also say that I am obviously not extremely familiar with the details of Cobaya's workings, so quite possibly I have overlooked something important. Would be curious to know whether you think this might be possible, @JesusTorrado. Maybe @williamjameshandley has some suggestions, too?
Thanks for the issue! Using the ppf has been on my to-do list for quite a while, but since evidences did work without it, I never got around to doing it.
I do think it's possible. It will just take me a bit to draft an implementation (I'm on paternity leave and trying to finish other projects), but I will try at least to create a test branch with a draft implementation for you very soon.
I assume you are talking in particular of KL shrinkage from prior to posterior, right?
Notice that this will not work out of the box for multidimensional priors.
> I assume you are talking in particular of KL shrinkage from prior to posterior, right?
Yes, exactly.
> Notice that this will not work out of the box for multidimensional priors.
Yes, multidimensional priors would require a bigger change. Would be a great addition in the long run, though.
> I do think it's possible. It will just take me a bit to draft an implementation (I'm on paternity leave and trying to finish other projects), but I will try at least to create a test branch with a draft implementation for you very soon.
That sounds great. I'd be happy to help, e.g. review a PR, if that's useful.