Sampling with Mirrored Stein Operators

Published: 29 Jan 2022 · Last Modified: 22 Oct 2023 · AABI 2022 Poster · Readers: Everyone
Keywords: Stein's method, Sampling, Mirror descent, Natural gradient descent, Probabilistic inference, Bayesian inference, Post-selection inference, Stein operators
TL;DR: We introduce a multi-particle generalization of mirror descent for sampling that works well for constrained domains and can exploit the geometry of the problem.
Abstract: Accurately approximating an unnormalized distribution with a discrete sample is a fundamental challenge in machine learning, probabilistic inference, and Bayesian inference. Particle evolution methods like Stein variational gradient descent have found great success in approximating unconstrained distributions but break down for constrained targets. We introduce a new family of particle evolution samplers suitable for constrained domains and non-Euclidean geometries. They minimize the Kullback-Leibler (KL) divergence to constrained target distributions by evolving particles in a dual space defined by a mirror map. We derive these samplers from a new class of mirrored Stein operators and adaptive kernels developed in this work. We establish the convergence of our new procedures under verifiable conditions on the target distribution. Finally, we demonstrate that these new samplers yield accurate approximations to distributions on the simplex and deliver valid confidence intervals in post-selection inference.
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2106.12506/code) (via CatalyzeX)
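
As a rough illustration of the idea sketched in the abstract, evolving particles in a dual space defined by a mirror map to approximate a constrained target, the snippet below runs an SVGD-style kernelized update on dual particles under an entropic (softmax) mirror map, targeting a Dirichlet distribution on the simplex. This is a minimal sketch under assumptions, not the paper's algorithm: the mirror map, the median-heuristic RBF kernel, the Dirichlet target, the step size, and all helper names (`to_dual`, `to_primal`, `dual_score`, `mirrored_update`) are chosen here for illustration, whereas the paper's mirrored Stein operators and adaptive kernels are defined precisely in the paper and the linked implementation.

```python
import numpy as np

# Illustrative sketch only (not the paper's exact method): evolve particles in
# the dual space of an entropic mirror map on the simplex with an SVGD-style
# kernelized update toward a Dirichlet target. Mirror map, kernel, target, and
# step size are assumptions made for this example.


def to_dual(theta):
    # Mirror map to dual coordinates: eta_k = log(theta_k / theta_d), k < d.
    return np.log(theta[:, :-1]) - np.log(theta[:, -1:])


def to_primal(eta):
    # Inverse map back to the simplex: theta = softmax([eta, 0]).
    z = np.concatenate([eta, np.zeros((eta.shape[0], 1))], axis=1)
    z -= z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)


def dual_score(eta, alpha):
    # Score of the Dirichlet(alpha) target pushed forward through the mirror
    # map. With the softmax parameterization the pushed-forward log-density is
    # sum_k alpha_k * log theta_k(eta) + const, so its gradient in eta is
    # alpha[:-1] - alpha.sum() * theta[:, :-1].
    theta = to_primal(eta)
    return alpha[:-1] - alpha.sum() * theta[:, :-1]


def rbf_kernel(x):
    # Median-heuristic RBF kernel k(x_i, x_j); grad_k[i, j] = d k(x_i, x_j)/d x_j,
    # the repulsive-term gradient used in standard SVGD updates.
    diff = x[:, None, :] - x[None, :, :]
    sq = (diff ** 2).sum(-1)
    h = np.median(sq) / np.log(x.shape[0] + 1.0) + 1e-8
    k = np.exp(-sq / h)
    grad_k = (2.0 / h) * diff * k[:, :, None]
    return k, grad_k


def mirrored_update(theta, alpha, step=0.05):
    # One kernelized transport step on the dual particles, mapped back to the
    # simplex so every iterate satisfies the constraint.
    eta = to_dual(theta)
    k, grad_k = rbf_kernel(eta)
    phi = (k @ dual_score(eta, alpha) + grad_k.sum(axis=1)) / eta.shape[0]
    return to_primal(eta + step * phi)


# Usage: approximate a Dirichlet(3, 2, 5) target with 100 particles.
rng = np.random.default_rng(0)
alpha = np.array([3.0, 2.0, 5.0])
theta = rng.dirichlet(np.ones(3), size=100)
for _ in range(500):
    theta = mirrored_update(theta, alpha)
print("particle mean:", theta.mean(axis=0))
print("target mean:  ", alpha / alpha.sum())
```

After a few hundred steps the particle mean should sit near the target mean alpha / alpha.sum() = (0.3, 0.2, 0.5), though the exact numbers depend on the assumed step size and kernel bandwidth.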