A Projection Operator to Balance Consistency and Complexity in Importance Sampling

16 Oct 2019 (modified: 05 May 2023) · AABI 2019
Keywords: Monte Carlo methods, Importance sampling
TL;DR: We propose a novel compressed kernelized importance sampling algorithm.
Abstract: Importance sampling (IS) is a standard Monte Carlo (MC) tool for computing information, such as moments or quantiles, about random variables with unknown distributions. IS is asymptotically consistent as the number of MC samples, and hence the number of deltas (particles) that parameterize the density estimate, goes to infinity. However, retaining infinitely many particles is intractable. We propose a scheme that keeps only a finite representative subset of particles and their augmented importance weights, and that is nearly consistent. To do so in an online manner, we approximate importance sampling in two ways. First, we replace the deltas by kernels, yielding kernel density estimates (KDEs). Second, we sequentially project the KDEs onto nearby lower-dimensional subspaces. We characterize the asymptotic bias of this scheme, which is determined by a compression parameter and the kernel bandwidth, and which yields a tunable tradeoff between consistency and memory. In experiments, we observe a favorable tradeoff between memory and accuracy, providing for the first time near-consistent compressions of arbitrary posterior distributions.
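
To make the two approximations concrete, here is a minimal Python sketch, not the paper's algorithm: it smooths weighted samples with Gaussian kernels (the KDE step) and replaces the paper's subspace projection with a much simpler nearest-center merge rule, where a hypothetical threshold `eps` stands in for the compression parameter. The function name, the merge criterion, and all parameters are assumptions chosen for illustration.

```python
import numpy as np

def compressed_kernel_is(log_target, draw_proposal, log_proposal,
                         n_steps, bandwidth=0.25, eps=0.1, seed=0):
    """Illustrative sketch of online kernelized IS with compression.

    Each draw contributes a Gaussian kernel of width `bandwidth` instead
    of a delta. A draw is retained as a new kernel center only if it lies
    farther than `eps` from every retained center; otherwise its
    importance weight is merged into the nearest center, so memory stays
    bounded. This distance rule is a simplified stand-in (an assumption,
    not the paper's method) for the sequential subspace projection.
    """
    rng = np.random.default_rng(seed)
    centers, weights = [], []
    for _ in range(n_steps):
        x = np.atleast_1d(draw_proposal(rng)).astype(float)  # x ~ q
        w = np.exp(log_target(x) - log_proposal(x))          # w ∝ p(x)/q(x)
        if centers:
            d2 = np.sum((np.asarray(centers) - x) ** 2, axis=-1)
            j = int(np.argmin(d2))
            if np.sqrt(d2[j]) <= eps:   # redundant particle: merge its
                weights[j] += w         # weight instead of growing memory
                continue
        centers.append(x)
        weights.append(w)
    w = np.asarray(weights)
    # The compressed KDE is sum_i w_i * N(x; centers[i], bandwidth^2 * I).
    return np.asarray(centers), w / w.sum(), bandwidth

# Toy check: 1-D standard normal target, broad N(0, 2^2) proposal
# (normalization constants cancel under self-normalized weights).
log_p = lambda x: -0.5 * float(np.sum(x ** 2))
log_q = lambda x: -float(np.sum(x ** 2)) / 8.0
draw_q = lambda rng: rng.normal(0.0, 2.0, size=1)

centers, wts, h = compressed_kernel_is(log_p, draw_q, log_q, n_steps=5000)
print(f"{len(centers)} retained kernels; estimated mean ≈ {(wts @ centers).item():.3f}")
```

Because the Gaussian kernels are symmetric about their centers, the mean of the compressed KDE equals the weighted mean of the retained centers, which is why the final line estimates the target mean directly from `wts @ centers`; shrinking `eps` recovers ordinary (uncompressed) kernelized IS at the cost of memory, mirroring the consistency/complexity tradeoff described above.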