Efficient Bayesian Sampling Using Normalizing Flows to Assist Markov Chain Monte Carlo Methods

Published: 15 Jun 2021, Last Modified: 05 May 2023. INNF+ 2021 contributed talk.
Keywords: MCMC, normalizing flows, Bayesian inference
TL;DR: We present a concurrent scheme in which a normalizing flow is used to speed up an MCMC sampler while the samples produced by the MCMC are used to train the flow, with applications to sampling Bayesian posterior distributions.
Abstract: Normalizing flows can generate complex target distributions and thus show promise in many applications in Bayesian statistics as an alternative or complement to MCMC for sampling posteriors. Since no data set from the target posterior distribution is available beforehand, the flow is typically trained using the reverse Kullback-Leibler (KL) divergence, which only requires samples from a base distribution. This strategy may perform poorly when the posterior is complicated and hard to sample with an untrained normalizing flow. Here we explore a distinct training strategy, using the direct (forward) KL divergence as loss, in which samples from the posterior are generated by (i) assisting a local MCMC algorithm on the posterior with a normalizing flow to accelerate its mixing rate and (ii) using the data generated this way to train the flow. The method only requires a limited amount of a priori input about the posterior, and can be used to estimate the evidence required for model validation, as we illustrate with examples.
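To make the concurrent scheme concrete, the sketch below shows one way steps (i) and (ii) could be wired together: a Metropolis-Hastings chain alternates local random-walk moves with global independence proposals drawn from a normalizing flow, and the chain's samples are used to fit the flow by maximum likelihood (the direct/forward KL direction). This is a minimal illustration under assumptions of our own, not the paper's implementation: the 2D banana-shaped posterior, the tiny affine-coupling flow, the step sizes, and the buffer/training schedule are all hypothetical choices made only for the sketch.

```python
# Hedged sketch of a flow-assisted MCMC loop with concurrent forward-KL training.
# All modelling choices here (toy posterior, flow architecture, hyperparameters)
# are illustrative assumptions, not taken from the paper.
import math
import torch

torch.manual_seed(0)

def log_posterior(x):
    """Unnormalized log density of a toy banana-shaped 2D posterior (assumption)."""
    x1, x2 = x[..., 0], x[..., 1]
    return -0.5 * (x1**2 / 4.0 + (x2 - 0.25 * x1**2) ** 2)

class CouplingFlow(torch.nn.Module):
    """Two affine coupling layers mapping a standard-normal base to 2D samples."""
    def __init__(self, hidden=32):
        super().__init__()
        self.nets = torch.nn.ModuleList(
            torch.nn.Sequential(torch.nn.Linear(1, hidden), torch.nn.Tanh(),
                                torch.nn.Linear(hidden, 2))
            for _ in range(2))

    def _scale_shift(self, net, cond):
        s, t = net(cond).chunk(2, dim=-1)
        return torch.tanh(s)[:, 0], t[:, 0]   # bounded log-scale for stability

    def forward(self, z):                      # base -> data, with log|det J|
        x, logdet = z, torch.zeros(z.shape[0])
        for i, net in enumerate(self.nets):
            keep, move = (0, 1) if i % 2 == 0 else (1, 0)
            s, t = self._scale_shift(net, x[:, keep:keep + 1])
            cols = [None, None]
            cols[keep], cols[move] = x[:, keep], x[:, move] * torch.exp(s) + t
            x, logdet = torch.stack(cols, dim=-1), logdet + s
        return x, logdet

    def inverse(self, x):                      # data -> base, with log|det J^-1|
        z, logdet = x, torch.zeros(x.shape[0])
        for i, net in reversed(list(enumerate(self.nets))):
            keep, move = (0, 1) if i % 2 == 0 else (1, 0)
            s, t = self._scale_shift(net, z[:, keep:keep + 1])
            cols = [None, None]
            cols[keep], cols[move] = z[:, keep], (z[:, move] - t) * torch.exp(-s)
            z, logdet = torch.stack(cols, dim=-1), logdet - s
        return z, logdet

    def log_prob(self, x):
        z, logdet = self.inverse(x)
        return -0.5 * (z**2).sum(-1) - math.log(2 * math.pi) + logdet

    def sample(self, n):
        z = torch.randn(n, 2)
        x, logdet = self.forward(z)
        return x, -0.5 * (z**2).sum(-1) - math.log(2 * math.pi) - logdet

flow = CouplingFlow()
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
x = torch.zeros(1, 2)   # current state of the chain
chain = []              # chain samples reused as the flow's training set

for it in range(2000):
    # (i) local move: random-walk Metropolis step on the posterior
    prop = x + 0.3 * torch.randn_like(x)
    if torch.rand(()) < (log_posterior(prop) - log_posterior(x)).exp():
        x = prop
    # flow-assisted global move: independence proposal drawn from the flow
    with torch.no_grad():
        y, logq_y = flow.sample(1)
        log_alpha = (log_posterior(y) - log_posterior(x)
                     + flow.log_prob(x) - logq_y)
        if torch.rand(()) < log_alpha.exp():
            x = y
    chain.append(x.squeeze(0))
    # (ii) fit the flow to the chain's samples (maximum likelihood = direct KL)
    if len(chain) >= 256 and it % 10 == 0:
        batch = torch.stack(chain[-256:])
        loss = -flow.log_prob(batch).mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
```

In this kind of construction, the local kernel keeps the chain valid even when the flow is still poorly trained, while the flow's independence proposals provide global moves that reduce autocorrelation as the flow improves on the accumulating samples.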
