Simulation-based Inference with the Generalized Kullback-Leibler Divergence

Published: 28 Jul 2023 · Last Modified: 28 Jul 2023 · SynS & ML @ ICML 2023
Keywords: simulation-based inference, likelihood-free inference, implicit likelihood, generalized energy-based model, unnormalized distribution estimation, hybrid model
TL;DR: The generalized Kullback-Leibler divergence enables fitting better models to the posterior in simulation-based inference.
Abstract: In simulation-based inference, the goal is to solve the inverse problem when the likelihood is available only implicitly. Neural Posterior Estimation commonly fits a normalized density estimator as a surrogate model for the posterior. This formulation cannot easily fit unnormalized surrogates because it optimizes the Kullback-Leibler divergence. We propose to optimize a generalized Kullback-Leibler divergence that accounts for the normalization constant of unnormalized distributions. The objective recovers Neural Posterior Estimation when the model class is normalized, and unifies it with Neural Ratio Estimation, combining both into a single objective. We investigate a hybrid model that offers the best of both worlds by combining a normalized base distribution with a learned ratio. We also present benchmark results.
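
For concreteness, here is a minimal sketch of how such an objective can be trained. For an unnormalized surrogate q~, the generalized KL divergence D_GKL(p || q~) = E_p[log p - log q~] + Z - 1 adds the normalizer Z(x) = ∫ q~(theta | x) dtheta to the usual maximum-likelihood term, so that, up to terms constant in the model, the loss is E[-log q~(theta | x)] + E_x[Z(x)]; when q~ is normalized (Z = 1), this reduces to the standard NPE loss. The PyTorch sketch below instantiates this for a hybrid model q~(theta | x) = q_psi(theta | x) * exp(f_phi(theta, x)), estimating Z(x) by Monte Carlo with samples from the normalized base. All module names, architectures, and the specific Z estimator are illustrative assumptions, not the paper's implementation.

```python
# A minimal sketch of the generalized Kullback-Leibler objective for a hybrid
# surrogate q~(theta | x) = q_psi(theta | x) * exp(f_phi(theta, x)).
# Assumed, illustrative components: a toy Gaussian base (a normalizing flow
# would play this role in practice) and an MLP log-ratio network.
import torch
import torch.nn as nn

class GaussianBase(nn.Module):
    """Toy normalized base q_psi(theta | x): a conditional diagonal Gaussian."""
    def __init__(self, x_dim, theta_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * theta_dim),
        )

    def _params(self, x):
        mu, log_sigma = self.net(x).chunk(2, dim=-1)
        return mu, log_sigma.clamp(-5.0, 5.0)

    def log_prob(self, theta, x):
        mu, log_sigma = self._params(x)
        return torch.distributions.Normal(mu, log_sigma.exp()).log_prob(theta).sum(-1)

    def rsample(self, x, num_samples):
        # Reparameterized samples, shape (num_samples, batch, theta_dim).
        mu, log_sigma = self._params(x)
        eps = torch.randn(num_samples, *mu.shape)
        return mu + log_sigma.exp() * eps

class RatioNet(nn.Module):
    """Unconstrained log-ratio f_phi(theta, x)."""
    def __init__(self, x_dim, theta_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim + theta_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, theta, x):
        return self.net(torch.cat([theta, x], dim=-1)).squeeze(-1)

def generalized_kl_loss(base, ratio, theta, x, num_z_samples=16):
    """E[-log q~(theta | x)] + E_x[Z(x)], with Z(x) estimated by Monte Carlo
    over reparameterized samples from the normalized base."""
    log_q_tilde = base.log_prob(theta, x) + ratio(theta, x)     # (batch,)
    theta_prime = base.rsample(x, num_z_samples)                # (M, batch, d)
    x_rep = x.unsqueeze(0).expand(num_z_samples, *x.shape)      # (M, batch, x_dim)
    z_hat = ratio(theta_prime, x_rep).exp().mean(dim=0)         # (batch,)
    return (-log_q_tilde + z_hat).mean()

# Toy usage on a linear-Gaussian simulator: theta ~ N(0, I), x = theta + noise.
theta = torch.randn(128, 2)
x = theta + 0.1 * torch.randn(128, 2)
base, ratio = GaussianBase(x_dim=2, theta_dim=2), RatioNet(x_dim=2, theta_dim=2)
opt = torch.optim.Adam([*base.parameters(), *ratio.parameters()], lr=1e-3)
opt.zero_grad()
loss = generalized_kl_loss(base, ratio, theta, x)
loss.backward()
opt.step()
```

Note the design choice this sketch reflects: setting the ratio term to zero recovers the plain NPE maximum-likelihood loss on the base, while freezing the base and training only the ratio resembles ratio estimation, which is the sense in which the single objective spans both regimes.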
Submission Number: 27