Projection Killer: peering through high dimensional posterior distribution

Published: 17 Jun 2024 · Last Modified: 25 Jul 2024 · ICML 2024 AI4Science Poster · CC BY 4.0
Keywords: normalizing flows, posterior estimation, density estimation
Abstract: Many modern applications of Bayesian inference, such as cosmology, are based on complicated forward models with high-dimensional parameter spaces. This considerably limits sampling of posterior distributions conditioned on observed data. In turn, this reduces the interpretability of posteriors to their one- and two-dimensional marginal distributions, even though more information is available in the full-dimensional distributions. We propose to learn smooth and differentiable representations of posterior distributions from their samples using normalizing flows, which we train with an added evidence-error loss term, to extend interpretability in multiple ways. Motivated by problems from cosmology, we implement a robust method to obtain one- and two-dimensional posterior profiles. These are obtained by optimizing, rather than integrating, over the remaining parameters, and are thus less prone than marginals to so-called projection effects. We also demonstrate how this representation provides an accurate estimator of the Bayesian evidence, with log-error at the 0.2 level, allowing accurate model comparison. We test our method on multi-modal mixtures of Gaussians up to dimension 32 before applying it to simulated cosmology examples.
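The distinction the abstract draws, profiles obtained by optimizing over the remaining parameters versus marginals obtained by integrating them out, can be illustrated on a toy density. The sketch below (not the paper's method; an illustrative two-component Gaussian mixture of my own choosing, evaluated on a grid rather than via a trained flow) computes both quantities for a 2D "posterior" with one parameter of interest and one nuisance parameter. Because one mode is broad and the other narrow in the nuisance direction, the marginal weights the two modes equally while the profile strongly prefers the narrow one, showing how the two summaries can disagree:

```python
import numpy as np
from scipy.stats import multivariate_normal
from scipy.integrate import trapezoid

# Hypothetical 2D posterior: two equal-weight Gaussian modes that differ
# only in their width along the nuisance direction y.
comps = [
    (0.5, multivariate_normal(mean=[-2.0, 0.0], cov=[[0.2, 0.0], [0.0, 4.0]])),
    (0.5, multivariate_normal(mean=[2.0, 0.0], cov=[[0.2, 0.0], [0.0, 0.05]])),
]

def log_post(x, y):
    """Log density of the mixture at points (x, y)."""
    pts = np.stack(np.broadcast_arrays(x, y), axis=-1)
    dens = sum(w * rv.pdf(pts) for w, rv in comps)
    return np.log(dens)

xs = np.linspace(-4.0, 4.0, 201)   # parameter of interest
ys = np.linspace(-8.0, 8.0, 401)   # nuisance parameter
X, Y = np.meshgrid(xs, ys, indexing="ij")
logp = log_post(X, Y)              # shape (201, 401)

# Profile: maximize the log posterior over the nuisance parameter.
profile = logp.max(axis=1)

# Marginal: integrate the posterior over the nuisance parameter.
marginal = np.log(trapezoid(np.exp(logp), ys, axis=1))
```

Integrating out y returns each mode's full mixture weight, so the marginal peaks at x = -2 and x = +2 are equal in height; the profile instead ranks modes by their peak density, so the narrow mode at x = +2 dominates. With a smooth, differentiable flow representation, the grid maximization above would be replaced by gradient-based optimization in the full parameter space.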
Submission Number: 73