Continuously Tempered PDMP samplers

Published: 31 Oct 2022, 18:00, Last Modified: 20 Oct 2022, 23:45 | NeurIPS 2022 Accept | Readers: Everyone
Keywords: PDMP, Zig-Zag, tempering, MCMC, Monte Carlo, Markov Chain, Probabilistic Inference
TL;DR: Tempering methods for PDMPs
Abstract: New sampling algorithms based on simulating continuous-time stochastic processes called piecewise deterministic Markov processes (PDMPs) have shown considerable promise. However, these methods can struggle to sample from multimodal or heavy-tailed distributions. We show how tempering ideas can improve the mixing of PDMPs in such cases. We introduce an extended distribution defined jointly over the state of the posterior distribution and an inverse temperature, which interpolates between a tractable distribution when the inverse temperature is 0 and the posterior when the inverse temperature is 1. The marginal distribution of the inverse temperature is a mixture of a continuous distribution on $[0,1)$ and a point mass at 1, so that samples obtained when the inverse temperature is 1 are exact draws from the posterior, while the sampler also explores distributions at lower inverse temperatures, which improves mixing. We show how PDMPs, and particularly the Zig-Zag sampler, can be implemented to sample from such an extended distribution. The resulting algorithm is easy to implement, and we show empirically that it can outperform existing PDMP-based samplers on challenging multimodal posteriors.
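To make the tempering construction concrete, here is a minimal sketch (not the authors' code) of a one-dimensional Zig-Zag sampler targeting a geometric tempering path $\pi_\beta(x) \propto \pi_0(x)^{1-\beta}\,\pi(x)^{\beta}$ between two Gaussians, at a *fixed* inverse temperature $\beta$. The geometric path and the names `grad_U`, `hess_U`, and `zigzag_moments` are illustrative assumptions; the paper's algorithm additionally treats $\beta$ as a coordinate of the sampler, with the mixture marginal described in the abstract. Both potentials here are quadratic, so the Zig-Zag event times can be simulated exactly by inverting the integrated switching rate.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative targets (assumed, not from the paper): a standard-Gaussian
# base distribution at beta = 0 and a Gaussian "posterior" with mean MU and
# standard deviation SIGMA at beta = 1.
MU, SIGMA = 3.0, 0.5

def grad_U(x, beta):
    """Gradient of the tempered potential U_beta = (1-beta)*U_0 + beta*U_1."""
    return (1.0 - beta) * x + beta * (x - MU) / SIGMA**2

def hess_U(beta):
    """Second derivative of U_beta (constant: both potentials are quadratic)."""
    return (1.0 - beta) + beta / SIGMA**2

def zigzag_moments(beta, T=10_000.0, x=0.0, v=1.0):
    """1-d Zig-Zag targeting pi_beta; returns time-averaged mean and variance."""
    t, ix, ix2 = 0.0, 0.0, 0.0
    while t < T:
        # Along the current segment the switching rate is
        # lambda(s) = max(0, a + b*s) with a = v * U'(x) and b = v^2 * U''.
        a = v * grad_U(x, beta)
        b = hess_U(beta)                     # v**2 = 1 for v = +/-1
        u = rng.exponential()
        # Solve  integral_0^tau max(0, a + b*s) ds = u  for the event time tau.
        if a >= 0.0:
            tau = (-a + np.sqrt(a * a + 2.0 * b * u)) / b
        else:
            tau = -a / b + np.sqrt(2.0 * u / b)
        tau = min(tau, T - t)                # do not integrate past the horizon
        # Accumulate exact path integrals of x and x^2 over the segment.
        ix += x * tau + v * tau**2 / 2.0
        ix2 += x * x * tau + x * v * tau**2 + tau**3 / 3.0
        x, v, t = x + v * tau, -v, t + tau   # move, flip velocity, advance time
    mean = ix / T
    return mean, ix2 / T - mean**2

mean, var = zigzag_moments(beta=1.0)
print(f"mean ~ {mean:.2f} (target {MU}), sd ~ {var**0.5:.2f} (target {SIGMA})")
```

Running this with `beta=1.0` recovers the posterior moments, while values of `beta` near 0 target the broad base distribution that, in the extended sampler, lets the process move freely between modes before returning to inverse temperature 1.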
Supplementary Material: zip