Efficient constrained sampling via the mirror-Langevin algorithm

21 May 2021, 20:43 (modified: 25 Oct 2021, 18:06) · NeurIPS 2021 Poster
Keywords: Langevin MCMC, Sampling, Optimization, mirror-Langevin
TL;DR: We propose and analyze a sampling algorithm that is a direct analog of mirror descent from optimization and show its efficacy in constrained sampling applications.
Abstract: We propose a new discretization of the mirror-Langevin diffusion and give a crisp proof of its convergence. Our analysis uses relative convexity/smoothness and self-concordance, ideas which originated in convex optimization, together with a new result in optimal transport that generalizes the displacement convexity of the entropy. Unlike prior works, our result both (1) requires much weaker assumptions on the mirror map and the target distribution, and (2) has vanishing bias as the step size tends to zero. In particular, for the task of sampling from a log-concave distribution supported on a compact set, our theoretical results are significantly better than the existing guarantees.
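To make the setup concrete, here is a minimal sketch of a mirror-Langevin iteration for sampling from a target π ∝ exp(−V) constrained to the interval (0, 1), using the entropic mirror map φ(x) = x log x + (1−x) log(1−x). This is a naive Euler-type discretization for illustration only, not the discretization analyzed in the paper; all names (`h`, `grad_V`, `EPS`) are illustrative choices.

```python
import math
import random

# Sketch (assumed simplification, not the paper's exact scheme):
# one mirror-Langevin step maps the iterate to the dual space via
# grad(phi), takes a noisy gradient step there, and maps back via
# grad(phi*).  For the entropic mirror map on (0, 1):
#   grad phi(x)  = log(x / (1 - x))      (logit)
#   grad phi*(y) = 1 / (1 + exp(-y))     (sigmoid)
#   grad^2 phi(x) = 1 / (x (1 - x))      (scalar Hessian in 1-D)

EPS = 1e-3  # numerical safeguard: keep iterates away from the boundary


def mirror_langevin_step(x, grad_V, h, rng):
    """One Euler-type mirror-Langevin step with step size h."""
    y = math.log(x / (1.0 - x))            # grad phi(x)
    hess = 1.0 / (x * (1.0 - x))           # grad^2 phi(x)
    y = y - h * grad_V(x) + math.sqrt(2.0 * h * hess) * rng.gauss(0.0, 1.0)
    x_new = 1.0 / (1.0 + math.exp(-y))     # grad phi*(y): always in (0, 1)
    return min(max(x_new, EPS), 1.0 - EPS)


def sample(grad_V, n_steps=20000, h=0.01, x0=0.5, seed=0):
    """Run the chain and return the trajectory of iterates."""
    rng = random.Random(seed)
    x, xs = x0, []
    for _ in range(n_steps):
        x = mirror_langevin_step(x, grad_V, h, rng)
        xs.append(x)
    return xs


# Example target: the uniform distribution on (0, 1), whose potential V
# is constant, so grad_V vanishes.  Every iterate stays inside (0, 1)
# because grad phi* maps the dual space back into the constraint set.
samples = sample(grad_V=lambda x: 0.0)
```

Note how the constraint is enforced structurally: the sigmoid map ∇φ* sends the unconstrained dual iterate back into (0, 1), so no projection or rejection step is needed; the `EPS` clamp is only a floating-point safeguard for this naive scheme.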
Supplementary Material: pdf
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.