Learning Rate Free Sampling in Constrained Domains

Published: 21 Sept 2023, Last Modified: 26 Dec 2023 · NeurIPS 2023 poster
Keywords: Sampling, Particle Based Variational Inference, Bayesian Inference, Wasserstein Gradient Descent, Coin Betting, Constrained Domains
TL;DR: We introduce new algorithms for sampling in constrained domains which are learning rate free.
Abstract: We introduce a suite of new particle-based algorithms for sampling in constrained domains which are entirely learning rate free. Our approach leverages coin betting ideas from convex optimisation, and the viewpoint of constrained sampling as a mirrored optimisation problem on the space of probability measures. Based on this viewpoint, we also introduce a unifying framework for several existing constrained sampling algorithms, including mirrored Langevin dynamics and mirrored Stein variational gradient descent. We demonstrate the performance of our algorithms on a range of numerical examples, including sampling from targets on the simplex, sampling with fairness constraints, and constrained sampling problems in post-selection inference. Our results indicate that our algorithms achieve competitive performance with existing constrained sampling methods, without the need to tune any hyperparameters.
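To make the idea in the abstract concrete, below is a minimal, hedged sketch of how a coin-betting (learning rate free) update can be combined with a mirror map so that particles remain in a constrained domain. This is an illustration, not the authors' exact algorithm: it assumes a Dirichlet target on the simplex, the additive log-ratio mirror map, a standard SVGD direction as a stand-in for the Wasserstein gradient in the mirrored (dual) space, and a simple Krichevsky–Trofimov style bettor with running gradient normalisation.

```python
# Sketch only: coin-betting SVGD in a mirrored (dual) space for a simplex-
# constrained target. All modelling choices here (Dirichlet target, log-ratio
# mirror map, KT bettor, normalisation scheme) are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
alpha = np.array([3.0, 5.0, 2.0])   # Dirichlet target on the 3-simplex (assumed)
d = len(alpha)

def to_simplex(y):
    """Inverse mirror map: dual coordinates y in R^{d-1} -> point on the simplex."""
    z = np.concatenate([y, np.zeros((y.shape[0], 1))], axis=1)
    z -= z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def grad_log_mirrored_target(y):
    """Gradient of the log push-forward of Dirichlet(alpha) under the mirror map.
    For this map the Jacobian term cancels nicely and the gradient reduces to
    alpha_j - (sum alpha) * x_j for j = 1..d-1."""
    x = to_simplex(y)
    return alpha[:-1] - alpha.sum() * x[:, :-1]

def svgd_direction(y):
    """Standard SVGD update direction with an RBF kernel (median heuristic)."""
    n = y.shape[0]
    diffs = y[:, None, :] - y[None, :, :]
    sq = (diffs ** 2).sum(-1)
    h = np.median(sq) / np.log(n + 1) + 1e-12
    K = np.exp(-sq / h)
    grads = grad_log_mirrored_target(y)
    return (K @ grads + 2.0 / h * (K[..., None] * diffs).sum(axis=1)) / n

n_particles = 50
y0 = rng.normal(size=(n_particles, d - 1))   # initial particles in the dual space
y = y0.copy()
grad_sum = np.zeros_like(y)                  # running sum of "coin outcomes"
reward = np.zeros(n_particles)               # accumulated betting reward per particle
max_norm = np.full(n_particles, 1e-8)        # running bound used to normalise outcomes
init_wealth = 1.0

for t in range(1, 501):
    c = svgd_direction(y)                                    # outcome: ascent direction
    max_norm = np.maximum(max_norm, np.linalg.norm(c, axis=1))
    c_hat = c / max_norm[:, None]                            # normalised so ||c_hat|| <= 1
    reward += np.einsum("ij,ij->i", c_hat, y - y0)           # reward from the current bet
    grad_sum += c_hat
    wealth = init_wealth + reward
    # KT-style bet: average past outcomes, scaled by current wealth, around y0
    y = y0 + grad_sum / (t + 1) * wealth[:, None]

x = to_simplex(y)                                            # map particles back to the simplex
print("empirical mean:", x.mean(axis=0))
print("Dirichlet mean:", alpha / alpha.sum())
```

Note that no step size appears anywhere in the loop: the displacement of each particle from its initialisation is set entirely by the betting wealth, which is the sense in which such schemes are "learning rate free". The particles themselves never leave the simplex because all updates happen in the unconstrained dual space and are mapped back through the inverse mirror map.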
Supplementary Material: zip
Submission Number: 4792