Generalizing self-normalized importance sampling with couplings

Published: 27 Jun 2024, Last Modified: 09 Oct 2024 · arXiv · CC BY 4.0
Abstract: An essential problem in statistics and machine learning is the estimation of expectations involving probability density functions (PDFs) with intractable normalizing constants. The self-normalized importance sampling (SNIS) estimator, which normalizes the importance sampling (IS) weights, has become the standard approach due to its simplicity. However, SNIS has been shown to exhibit high variance in challenging estimation problems, e.g., those involving rare events or posterior predictive distributions in Bayesian statistics. Further, most state-of-the-art adaptive importance sampling (AIS) methods adapt the proposal as if the weights had not been normalized. In this paper, we propose a framework that recasts the original task as the estimation of a ratio of two integrals. In our new formulation, we obtain samples from a joint proposal distribution in an extended space, with two of its marginals playing the role of proposals used to estimate each integral. Importantly, the framework allows us to induce and control a dependency between both estimators. We propose a construction of the joint proposal that decomposes into two (multivariate) marginals and a coupling. This leads to a two-stage framework suitable for integration with existing or new AIS and/or variational inference (VI) algorithms. The marginals are adapted in the first stage, while the coupling can be chosen and adapted in the second stage. We show in several realistic examples the benefits of the proposed methodology, including an application to Bayesian prediction with misspecified models.
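To make the ratio-of-integrals view concrete, the following is a minimal, hypothetical sketch (not the paper's algorithm): it contrasts standard SNIS, which reuses one set of samples in numerator and denominator, with a coupled ratio estimator that draws pairs from a joint proposal whose marginals q1 and q2 handle the numerator and denominator integrals separately. The unnormalized target `pi_tilde`, the test function `f`, the Gaussian proposals, and the common-random-numbers coupling are all assumed, illustrative choices; the paper's framework adapts the marginals and the coupling, which this sketch does not do.

```python
import numpy as np

rng = np.random.default_rng(0)

def norm_pdf(x, mu, sig):
    """Gaussian density, used as the proposal density in both estimators."""
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2.0 * np.pi))

def pi_tilde(x):
    """Unnormalized target (hypothetical example): Gaussian with mean 1, std 1."""
    return np.exp(-0.5 * (x - 1.0) ** 2)

def f(x):
    """Test function; the goal is E_pi[f] = (int f*pi_tilde) / (int pi_tilde)."""
    return x ** 2

N = 100_000

# Standard SNIS: a single proposal q; the same weighted samples appear in the
# numerator and the denominator.
mu_q, sig_q = 0.0, 2.0
x = rng.normal(mu_q, sig_q, N)
w = pi_tilde(x) / norm_pdf(x, mu_q, sig_q)
snis = np.sum(w * f(x)) / np.sum(w)

# Coupled ratio estimator (illustrative): draw pairs (x1, x2) from a joint
# proposal whose marginals q1 and q2 target the numerator and denominator
# integrals. The coupling here is common random numbers (shared noise z), an
# assumed choice that induces correlation between the two estimators.
mu1, sig1 = 1.0, 1.5   # marginal proposal q1 (numerator integral)
mu2, sig2 = 1.0, 1.0   # marginal proposal q2 (denominator integral)
z = rng.standard_normal(N)
x1 = mu1 + sig1 * z
x2 = mu2 + sig2 * z
num = np.mean(pi_tilde(x1) * f(x1) / norm_pdf(x1, mu1, sig1))
den = np.mean(pi_tilde(x2) / norm_pdf(x2, mu2, sig2))
coupled = num / den

print(f"SNIS estimate:    {snis:.4f}")   # both should be close to E_pi[x^2] = 2
print(f"Coupled estimate: {coupled:.4f}")
```

In this toy setup both estimators are consistent for the same expectation; the point of the sketch is only to show where the joint proposal and the coupling enter the ratio estimator.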