Scalable Sampling via Generalized Fixed-Point Diffusion Matching
Keywords: Sampling methods, diffusion models, variational methods
TL;DR: We propose a stable, unified objective for learning stochastic transport maps between arbitrary distributions by treating diffusion-based sampling as a fixed-point iteration rooted in Nelson’s relation
Abstract: Sampling from unnormalized densities using diffusion models has emerged as a powerful paradigm. However, while recent approaches that use least-squares "matching" objectives have improved scalability, they often necessitate significant trade-offs, such as restricting prior distributions or relying on unstable optimization schemes. By generalizing these methods as special forms of fixed-point iterations rooted in Nelson's relation, we develop a new method that addresses these limitations. Our approach enables learning a stochastic transport map between arbitrary prior and target distributions with a single, scalable, and stable objective. Furthermore, we introduce a damped variant of this iteration that incorporates a regularization term to mitigate mode collapse. Empirically, we demonstrate that our method enables sampling at unprecedented scales while preserving mode diversity, achieving state-of-the-art results on complex synthetic densities and high-dimensional molecular benchmarks.
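The abstract's "damped variant" refers to the general idea of damping a fixed-point iteration. The paper's actual objective and transport-map parameterization are not given here; as a hypothetical illustration only, the sketch below shows the generic damping scheme x_{k+1} = (1 - lam) * x_k + lam * F(x_k) on a simple scalar map, where the function name `damped_fixed_point` and all parameters are assumptions for illustration.

```python
import math


def damped_fixed_point(F, x0, lam=0.5, tol=1e-8, max_iter=1000):
    """Damped fixed-point iteration: x <- (1 - lam) * x + lam * F(x).

    With lam = 1 this reduces to the plain iteration x <- F(x); smaller
    lam interpolates toward the current iterate, which can stabilize
    iterations that would otherwise oscillate or diverge.
    """
    x = x0
    for _ in range(max_iter):
        x_next = (1.0 - lam) * x + lam * F(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x


# Example: the classic fixed point of cos (x* satisfies x = cos(x)).
root = damped_fixed_point(math.cos, x0=1.0)
```

Damping trades per-step progress for stability; the paper's regularized variant plays an analogous role at the level of the learning objective.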
Submission Number: 123