Bayesian inference for inverse problems hinges critically on the choice of priors. In the absence of specific prior information, population-level distributions can serve as effective priors for parameters of interest. With the advent of machine learning, the use of data-driven population-level distributions (encoded, e.g., in a trained deep neural network) as priors is emerging as an appealing alternative to simple parametric priors in a variety of inverse problems. However, in many astrophysical applications, it is often difficult or even impossible to acquire independent and identically distributed samples from the underlying data-generating process of interest to train these models. In these cases, corrupted data or a surrogate, e.g. a simulator, is often used to produce training samples, meaning that there is a risk of obtaining misspecified priors. This, in turn, can bias the inferred posteriors in ways that are difficult to quantify, which limits the potential applicability of these models in real-world scenarios. In this work, we propose addressing this issue by iteratively updating the population-level distributions by retraining the model with posterior samples from different sets of observations, and showcase the potential of this method on the problem of background image reconstruction in strong gravitational lensing when score-based models are used as data-driven priors. We show that starting from a misspecified prior distribution, the updated distribution becomes progressively closer to the underlying population-level distribution, and the resulting posterior samples exhibit reduced bias after several updates.
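As a rough illustration of the iterative prior-update scheme this abstract describes, the sketch below shows one possible loop structure. The `prior` object, its `retrain` method, and the `sample_posterior` callable are hypothetical placeholders for a trained score-based model and a posterior sampler, not the authors' implementation.

```python
# Hypothetical sketch of the iterative prior-updating loop described above.
# `prior.retrain` and `sample_posterior` are placeholder callables, not the
# authors' code; only the overall loop structure is being illustrated.
import numpy as np

def update_population_prior(prior, observation_sets, sample_posterior,
                            n_rounds=5, n_samples=256):
    """Iteratively refit a data-driven prior on pooled posterior samples."""
    for _ in range(n_rounds):
        pooled = []
        for obs in observation_sets:
            # Draw posterior samples for each observation under the current
            # prior (e.g. with a score-based / diffusion posterior sampler).
            pooled.append(sample_posterior(obs, prior, n_samples=n_samples))
        # Retraining on the pooled posterior samples nudges the prior toward
        # the underlying population-level distribution.
        prior = prior.retrain(np.concatenate(pooled, axis=0))
    return prior
```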
Reionization is one of the least understood processes in the evolutionary history of the Universe, mostly because of the many astrophysical processes occurring simultaneously, about which we still lack a clear understanding.
Bayesian inference with Markov Chain Monte Carlo requires efficient computation of the likelihood function. In some scientific applications, the likelihood must be computed by numerically solving a partial differential equation, which can be prohibitively expensive. We demonstrate that some such problems can be made tractable by amortizing the computation with a surrogate likelihood function implemented by a neural network. We show that this has two additional benefits: reducing noise in the likelihood evaluations and providing fast gradient calculations. In experiments, the approach is applied to a model of heliospheric transport of galactic cosmic rays, where it enables efficient sampling from the posterior of latent parameters in the Parker equation.
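The surrogate-likelihood idea can be illustrated with a short, self-contained sketch: fit a small network to (parameter, log-likelihood) pairs precomputed with the expensive PDE solver, then hand the cheap surrogate to a standard sampler. The architecture, training setup, and random-walk Metropolis sampler below are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch (not the paper's code) of amortizing an expensive PDE-based
# likelihood with a neural-network surrogate, then sampling with random-walk
# Metropolis. Training data are assumed to be precomputed (theta, log L) pairs.
import numpy as np
import torch
import torch.nn as nn

def train_surrogate(thetas, log_likes, epochs=2000):
    """Fit an MLP to precomputed (theta, log-likelihood) pairs (2-D/1-D arrays)."""
    net = nn.Sequential(nn.Linear(thetas.shape[1], 64), nn.Tanh(),
                        nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))
    opt = torch.optim.Adam(net.parameters(), lr=1e-3)
    x = torch.as_tensor(thetas, dtype=torch.float32)
    y = torch.as_tensor(log_likes, dtype=torch.float32).unsqueeze(-1)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(net(x), y)
        loss.backward()
        opt.step()
    return net

def metropolis(log_post, theta0, n_steps=10000, step=0.05, rng=None):
    """Random-walk Metropolis driven by the cheap surrogate log-posterior."""
    rng = rng or np.random.default_rng(0)
    theta, lp = np.asarray(theta0, float), log_post(theta0)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(theta.shape)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    return np.array(chain)
```

In use, `log_post` would wrap the trained network (evaluated under `torch.no_grad()`) plus a log-prior term. Because the surrogate is differentiable, its autograd gradients could equally drive a gradient-based sampler such as HMC; the plain Metropolis step above just keeps the sketch short.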
We present the first application of the Wavelet Scattering Transform (WST) to constrain the nature of gravity using the three-dimensional (3D) large-scale structure of the universe. Utilizing the Quijote-MG N-body
We carry out a Bayesian analysis of dark matter (DM) direct detection data to determine particle model parameters using the Truncated Marginal Neural Ratio Estimation (TMNRE) machine learning technique. TMNRE avoids an explicit calculation of the likelihood, which is instead estimated from simulated data, unlike in traditional Markov Chain Monte Carlo (MCMC) algorithms. This speeds up the computation of the posterior distributions by several orders of magnitude, which makes it possible to perform the Bayesian analysis of an otherwise computationally prohibitive number of benchmark points. In this article we demonstrate that, in the TMNRE framework, it is possible to include, combine, and remove different datasets in a modular fashion, which is fast and simple because there is no need to re-train the machine learning algorithm or to define a combined likelihood. To assess the performance of this method, we consider the case of WIMP DM with spin-dependent and spin-independent interactions with protons and neutrons in a xenon experiment. After validating our results with MCMC, we employ the TMNRE procedure to determine the regions where the DM parameters can be reconstructed. Finally, we present CADDENA, a Python package that implements the modular Bayesian analysis of direct detection experiments described in this work.
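The modular combination described here rests on a simple property of ratio estimation: for conditionally independent datasets, the per-dataset log likelihood-to-evidence ratios add (up to a parameter-independent constant), so datasets can be included or dropped without retraining or defining a joint likelihood. The snippet below is an illustrative sketch of that bookkeeping, not CADDENA's actual API; the `log_ratio_estimators` are assumed to be already-trained per-dataset networks.

```python
# Illustrative sketch (not CADDENA's API) of modular dataset combination:
# each dataset d contributes a trained log likelihood-to-evidence ratio
# estimator log r_d(x_d | theta); for independent datasets these add, giving
# the combined (unnormalized) log posterior-to-prior ratio.
import numpy as np

def combined_log_ratio(theta, observations, log_ratio_estimators):
    """Sum per-dataset log-ratios; drop a dataset by removing its entry."""
    return sum(est(obs, theta)
               for est, obs in zip(log_ratio_estimators, observations))

def posterior_weights(prior_samples, observations, log_ratio_estimators):
    """Reweight prior samples into normalized posterior weights."""
    log_r = np.array([combined_log_ratio(t, observations, log_ratio_estimators)
                      for t in prior_samples])
    w = np.exp(log_r - log_r.max())   # subtract max for numerical stability
    return w / w.sum()
```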
In inference problems, we often have domain knowledge which allows us to define summary statistics that capture most of the information content in a