GNN-Guided Block Selection in Gibbs MCMC

NeurIPS 2025 Workshop FPI, Submission 84

Published: 23 Sept 2025, Last Modified: 30 Nov 2025. FPI-NEURIPS2025 Poster. License: CC BY 4.0
Track: Main Track
Keywords: Bayesian Networks, probabilistic graphical models, posterior inference, approximate inference, Bayesian inference, probabilistic inference, inference, Markov Chain Monte Carlo, MCMC, Gibbs sampling, blocked Gibbs sampling, block selection, spectral gap, mixing time, Graph Neural Networks, GNNs
TL;DR: Training GNNs to propose good blocks for Gibbs MCMC sampling in highly coupled Bayesian networks to amortise posterior inference
Abstract: Exact inference in large Bayesian Networks (BNs) is computationally intractable, limiting their practical application. Markov Chain Monte Carlo (MCMC) methods such as Gibbs sampling offer a scalable alternative but can be arbitrarily slowed by highly coupled variables, which can be addressed by jointly sampling those variables as a block. We propose an automated block-detection method to amortise inference time: training a Graph Neural Network (GNN) to propose blocks directly from the BN structure. We further introduce a novel coupling heuristic based on the Markov chain's spectral gap, which we show can be more robust than existing heuristics. Our GNN, trained on a dataset of small, randomly generated BNs, generalises well to larger networks, accelerating MCMC sample efficiency in our experiments.
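The spectral-gap idea the abstract mentions can be illustrated on a toy case. The sketch below (a hypothetical illustration, not the authors' code; the joint distribution, variable names, and the systematic-scan construction are all assumptions) builds the transition matrix of a single-site Gibbs sampler over two strongly coupled binary variables and computes the spectral gap, 1 minus the modulus of the second-largest eigenvalue. A small gap signals slow mixing, which is what motivates sampling coupled variables jointly as a block.

```python
import numpy as np

# Joint p(x1, x2) over two binary variables. Small eps puts most mass
# on agreeing states (0,0) and (1,1), i.e. strong coupling.
eps = 0.05
joint = np.array([[0.5 - eps, eps],
                  [eps,       0.5 - eps]])  # rows: x1, cols: x2

# Enumerate the 4 joint states (x1, x2).
states = [(a, b) for a in (0, 1) for b in (0, 1)]

def gibbs_transition(joint):
    """Systematic-scan Gibbs kernel: resample x1 | x2, then x2 | x1."""
    n = len(states)
    T1 = np.zeros((n, n))  # update of x1, x2 held fixed
    T2 = np.zeros((n, n))  # update of x2, x1 held fixed
    for i, (a, b) in enumerate(states):
        for j, (a2, b2) in enumerate(states):
            if b2 == b:  # x1 -> a2 with prob p(x1 = a2 | x2 = b)
                T1[i, j] = joint[a2, b] / joint[:, b].sum()
            if a2 == a:  # x2 -> b2 with prob p(x2 = b2 | x1 = a)
                T2[i, j] = joint[a, b2] / joint[a, :].sum()
    return T1 @ T2

def spectral_gap(T):
    """1 - |second-largest eigenvalue|; a larger gap means faster mixing."""
    mags = np.sort(np.abs(np.linalg.eigvals(T)))[::-1]
    return 1.0 - mags[1]

gap = spectral_gap(gibbs_transition(joint))
print(f"single-site Gibbs spectral gap: {gap:.3f}")
```

Sampling (x1, x2) jointly as one block would draw directly from the joint and mix in a single step (gap 1), so the gap of the single-site chain serves as a coupling score: the smaller it is, the more those variables benefit from being blocked together.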
Submission Number: 84