MCMC Variational Inference via Uncorrected Hamiltonian Annealing

21 May 2021, 20:41 (modified: 30 Oct 2021, 13:17) · NeurIPS 2021 Poster · Readers: Everyone
Keywords: Variational Inference, Annealed Importance Sampling, MCMC, Hamiltonian Monte Carlo, differentiable
TL;DR: We introduce a new method combining VI and HMC that yields tighter and differentiable lower bounds on the marginal likelihood.
Abstract: Given an unnormalized target distribution, we want to obtain approximate samples from it and a tight lower bound on its (log) normalization constant log Z. Annealed Importance Sampling (AIS) with Hamiltonian MCMC is a powerful method that can be used to do this. Its main drawback is that it uses non-differentiable transition kernels, which makes tuning its many parameters hard. We propose a framework to use an AIS-like procedure with Uncorrected Hamiltonian MCMC, called Uncorrected Hamiltonian Annealing. Our method leads to tight and differentiable lower bounds on log Z. We show empirically that our method yields better performance than other competing approaches, and that the ability to tune its parameters using reparameterization gradients may lead to large performance improvements.
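To make the idea in the abstract concrete, below is a minimal NumPy sketch of annealed importance sampling whose transitions are uncorrected leapfrog (HMC without the Metropolis-Hastings accept/reject step). It is an illustration only, not the paper's exact UHA estimator: the toy Gaussian target, the standard AIS log-weight increments, the step size, and all variable names are assumptions made for this demo.

```python
import numpy as np

rng = np.random.default_rng(0)

# Normalized base distribution q0 = N(0, 4) and unnormalized target
# f(x) = exp(-(x - 1)^2 / 2), whose true log Z is 0.5 * log(2*pi).
SIGMA0_SQ = 4.0

def log_q0(x):
    return -0.5 * x**2 / SIGMA0_SQ - 0.5 * np.log(2 * np.pi * SIGMA0_SQ)

def log_f(x):
    return -0.5 * (x - 1.0) ** 2

def grad_log_bridge(x, beta):
    # Gradient of the annealed density pi_beta ∝ q0^(1-beta) * f^beta.
    return (1 - beta) * (-x / SIGMA0_SQ) + beta * (-(x - 1.0))

def uncorrected_hmc(x, beta, n_steps=5, eps=0.1):
    # Fresh momentum + leapfrog integration, with NO accept/reject step:
    # this is the "uncorrected" transition kernel the abstract refers to.
    p = rng.standard_normal(x.shape)
    p = p + 0.5 * eps * grad_log_bridge(x, beta)   # initial half kick
    for _ in range(n_steps):
        x = x + eps * p                            # drift
        p = p + eps * grad_log_bridge(x, beta)     # full kick
    p = p - 0.5 * eps * grad_log_bridge(x, beta)   # undo half of last kick
    return x

N, K = 4000, 50                       # particles and annealing steps
betas = np.linspace(0.0, 1.0, K + 1)
x = 2.0 * rng.standard_normal(N)      # draw x ~ q0
log_w = np.zeros(N)
for k in range(1, K + 1):
    # Standard AIS weight increment between consecutive bridge densities.
    log_w += (betas[k] - betas[k - 1]) * (log_f(x) - log_q0(x))
    x = uncorrected_hmc(x, betas[k])

# Estimate log Z via a numerically stable log-mean-exp of the weights.
m = log_w.max()
log_Z_est = m + np.log(np.mean(np.exp(log_w - m)))
print(log_Z_est)  # should land near the true value 0.5*log(2*pi) ≈ 0.919
```

Because the leapfrog moves skip the accept/reject correction, the transitions here are differentiable in their parameters (step size, number of steps), which is what makes tuning by reparameterization gradients possible; the paper's contribution is the weight computation that keeps the resulting estimate a valid lower bound on log Z, which this simplified sketch does not reproduce.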
Supplementary Material: pdf
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
Code: zip