Single-Step Consistent Diffusion Samplers

11 May 2025 (modified: 29 Oct 2025) · Submitted to NeurIPS 2025 · CC BY 4.0
Keywords: Efficient Sampling, Unnormalized Distributions, Consistency, Controlled SDEs
TL;DR: We propose consistent diffusion samplers that enable single-step sampling from unnormalized target distributions.
Abstract: Sampling from unnormalized target distributions is a fundamental yet challenging task in machine learning and statistics. Existing sampling algorithms typically require many iterative steps to produce high-quality samples, leading to high computational costs that limit their practicality in time-sensitive or resource-constrained settings. In this work, we introduce *consistent diffusion samplers*, a new class of samplers designed to generate high-fidelity samples in a single step. We first propose Consistency-Distilled Diffusion Samplers (CDDS), which demonstrate that consistency distillation can be accomplished in sampling contexts without pre-collected training datasets. To eliminate the need for a pre-trained sampler, we further propose Self-Consistent Diffusion Samplers (SCDS), which perform self-distillation during training. SCDS learns to carry out diffusion sampling and to skip intermediate steps via a self-consistency loss. Through extensive experiments on a variety of synthetic and real-world unnormalized distributions, we show that our approaches yield high-fidelity samples using less than 1% of the network evaluations required by traditional diffusion samplers.
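To make the self-consistency idea from the abstract concrete, the following is a minimal numerical sketch, not the paper's method: a student map `f(x, t)` that predicts the final sample directly is penalized for disagreeing with its own prediction one sampler step later, where the later-time branch would be treated as a fixed (stop-gradient) target. The linear parameterization, the toy drift, and all function names here are hypothetical stand-ins for the learned networks and controlled SDE integrator used in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D "student" map f(x, t): predicts the clean sample from (x, t).
theta = rng.normal(size=3)

def f(x, t, th):
    # Toy linear parameterization; a real sampler would use a neural network.
    return th[0] * x + th[1] * t + th[2]

def sampler_step(x, t, dt):
    # Stand-in for one step of a diffusion sampler's integrator: here a simple
    # deterministic drift pulling x toward 0 (as if the target were N(0, 1)).
    return x + (-x) * dt

def self_consistency_loss(x_t, t, dt, th):
    # Encourage f(x_t, t) == f(x_{t-dt}, t-dt): the prediction should not
    # change as the sampler advances. In real training, no gradient would
    # flow through the earlier-time target branch.
    x_prev = sampler_step(x_t, t, dt)
    target = f(x_prev, t - dt, th)  # treated as fixed (stop-gradient)
    pred = f(x_t, t, th)
    return float(np.mean((pred - target) ** 2))

x = rng.normal(size=128)        # batch of noisy states at time t
loss = self_consistency_loss(x, t=1.0, dt=0.1, th=theta)
```

Driving this loss to zero over all `(x, t)` would make `f` invariant along sampler trajectories, which is the property that lets a single evaluation replace many iterative steps.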
Supplementary Material: zip
Primary Area: Probabilistic methods (e.g., variational inference, causal inference, Gaussian processes)
Submission Number: 23873