Optimizing Markov Chain Monte Carlo Convergence with Normalizing Flows and Gibbs Sampling

Published: 28 Oct 2023, Last Modified: 20 Nov 2023, NeurIPS 2023 AI4Science Poster
Keywords: MCMC, normalizing flow, Monte Carlo, sampling
TL;DR: GflowMC improves MCMC sampling with normalizing flows by employing a Metropolis-within-Gibbs partial update scheme in latent space.
Abstract: Generative models have begun to enter the scientific computing toolkit. One notable example is the use of normalizing flows (NFs) in sampling and variational inference algorithms. This work introduces a novel algorithm, GflowMC, which relies on a Metropolis-within-Gibbs framework operating in the latent space of an NF. This approach addresses the vanishing acceptance probabilities often encountered with NF-generated independent proposals, while retaining non-local updates, which makes it well suited to sampling multi-modal distributions. We assess GflowMC's performance, concentrating on the $\phi^4$ model from statistical mechanics. Our results demonstrate that, with an appropriately chosen size for the partial updates, the Markov chain Monte Carlo (MCMC) converges faster than with full updates. Additionally, we explore the adaptability of GflowMC for biasing proposals towards increasing the update frequency of critical coordinates, such as coordinates strongly correlated with mode switching in multi-modal targets.
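To make the scheme concrete, here is a minimal sketch of a Metropolis-within-Gibbs partial update in the latent space of a normalizing flow. Everything in it is an assumption for illustration: the toy double-well target (a stand-in for the $\phi^4$ action), the placeholder affine flow, and the names `flow_forward`, `log_weight`, and `BLOCK` are hypothetical, not the authors' implementation.

```python
# Sketch: Metropolis-within-Gibbs partial updates in the latent space of a
# normalizing flow. All names and the toy target/flow below are illustrative
# assumptions, not the GflowMC implementation.
import numpy as np

rng = np.random.default_rng(0)
D = 8      # dimensionality of the field configuration (assumed)
BLOCK = 4  # size of the partial (Gibbs) update; the paper tunes this

def log_target(x):
    # Toy multi-modal target: independent double wells, a stand-in for phi^4.
    return np.sum(-(x**2 - 1.0) ** 2)

def flow_forward(z):
    # Placeholder flow: elementwise affine map x = a*z + b. A trained NF
    # (e.g. RealNVP) would go here; we only need f(z) and log|det J_f(z)|.
    a, b = 1.5, 0.0
    return a * z + b, D * np.log(abs(a))

def log_weight(z):
    # log importance weight: log[ p(f(z)) |det J_f(z)| / N(z; 0, I) ]
    x, logdet = flow_forward(z)
    log_prior = -0.5 * np.sum(z**2)  # up to a constant, which cancels in ratios
    return log_target(x) + logdet - log_prior

z = rng.standard_normal(D)
accepted = 0
for step in range(5000):
    idx = rng.choice(D, size=BLOCK, replace=False)  # coordinates to refresh
    z_prop = z.copy()
    z_prop[idx] = rng.standard_normal(BLOCK)        # prior draw on the block
    # Because the block proposal is the latent prior, the Metropolis-Hastings
    # ratio reduces to a ratio of importance weights.
    if np.log(rng.uniform()) < log_weight(z_prop) - log_weight(z):
        z = z_prop
        accepted += 1
print(f"acceptance rate: {accepted / 5000:.2f}")
```

Setting `BLOCK = D` recovers the full-update independence sampler whose vanishing acceptance rates the abstract describes; smaller blocks leave part of the latent configuration fixed, trading proposal non-locality for higher acceptance.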
Submission Track: Original Research
Submission Number: 123