LEAPS: A discrete neural sampler via locally equivariant networks

Published: 06 Mar 2025, Last Modified: 24 Apr 2025 · FPI-ICLR2025 Oral · CC BY 4.0
Keywords: CTMC, Bayesian inference, MCMC, sampling
TL;DR: A discrete neural sampler for unnormalized densities via locally equivariant networks.
Abstract: We propose LEAPS, an algorithm to sample from discrete distributions known up to normalization by learning a rate matrix of a continuous-time Markov chain (CTMC). The method can be seen as a continuous-time formulation of annealed importance sampling and sequential Monte Carlo, extended so that the variance of the importance weights is offset by the inclusion of the CTMC. To derive these importance weights, we introduce a set of Radon-Nikodym derivatives of CTMCs over their path measures. Because the computation of these weights is intractable with standard neural network parameterizations of rate matrices, we devise a new compact representation of rate matrices via what we call \textit{locally equivariant} functions. To parameterize them, we introduce a family of locally equivariant multilayer perceptrons, attention layers, and convolutional networks, and provide an approach to building deep networks that preserve local equivariance. This property allows us to propose a scalable training algorithm that minimizes the variance of the importance weights associated with the CTMC. We demonstrate the efficacy of our method on problems in statistical physics.
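To illustrate the local equivariance property the abstract refers to, here is a minimal toy sketch (not the paper's actual architecture) of one common way to build such a function for binary spin states: for each site i, a hidden layer sees the configuration with site i masked out, and the output is multiplied by the spin value x_i. This enforces g_i(flip_i(x)) = -g_i(x), the antisymmetry under single-site flips that makes the importance-weight computation tractable. All names (`locally_equivariant_mlp`, `W1`, `b1`, `w2`) are hypothetical.

```python
import numpy as np

def locally_equivariant_mlp(x, W1, b1, w2):
    """Toy locally equivariant network over +/-1 spin vectors.

    For each site i, the hidden layer only sees x with site i zeroed out,
    and the result is multiplied by x[i]. Hence flipping site i negates
    output i: g_i(flip_i(x)) = -g_i(x).  (Illustrative sketch only.)
    """
    n = x.shape[0]
    out = np.empty(n)
    for i in range(n):
        x_masked = x.copy()
        x_masked[i] = 0.0                # hide site i from the hidden layer
        h = np.tanh(W1 @ x_masked + b1)  # h is independent of x[i]
        out[i] = x[i] * (w2 @ h)         # spin multiplication gives equivariance
    return out

# Check the antisymmetry under a single-site flip.
rng = np.random.default_rng(0)
n, hid = 6, 8
W1 = rng.normal(size=(hid, n))
b1 = rng.normal(size=hid)
w2 = rng.normal(size=hid)
x = rng.choice([-1.0, 1.0], size=n)
g = locally_equivariant_mlp(x, W1, b1, w2)
for i in range(n):
    y = x.copy()
    y[i] = -y[i]
    assert abs(locally_equivariant_mlp(y, W1, b1, w2)[i] + g[i]) < 1e-10
```

The masking step is what distinguishes this from an ordinary MLP: because the hidden representation for site i cannot depend on x_i, the sign of output i is carried entirely by the multiplicative factor x_i.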
Submission Number: 87
