Keywords: Diffusion Models, Score Matching, Conditional Sampling, Probabilistic Reasoning, Tractability, Energy-based Models, Fisher Divergence
TL;DR: We propose DISCO, a method for training score-based models without learning diffused data distributions, enabling accurate conditional sampling.
Abstract: The success of score-based models largely stems from the idea of reversing (denoising) a diffusion process, which is characterized by a collection of time-indexed score fields.
While diffusion-based models have achieved impressive results in sample generation, leveraging them for sound probabilistic inference — particularly for sampling from *arbitrary conditional distributions* — remains challenging.
In short, this difficulty arises because conditioning information is observed only for clean data; it is unavailable at the higher noise levels that would be required to generate exact conditional samples.
In this paper, we introduce an effective approach to *DIffusion-free SCOre matching* (DISCO), which sidesteps the need for time-dependent score fields altogether.
Our method is based on a principled objective that estimates only the score of the (slightly perturbed) data distribution.
In our experiments, score models learned with DISCO are competitive with state-of-the-art diffusion models in terms of sample quality.
More importantly, DISCO yields a more faithful representation of the underlying data distribution and — crucially — enables sampling from arbitrary conditional distributions.
This capability opens the door to sound and flexible probabilistic reasoning with score-based models.
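To make the training idea concrete, here is a minimal sketch of denoising score matching at a single, small noise level, i.e., learning only the score of a slightly perturbed data distribution. This is an illustration under our own assumptions, not the authors' exact DISCO objective; `score_net`, `sigma`, and the loss weighting are hypothetical placeholders.

```python
import torch

def single_level_dsm_loss(score_net, x, sigma=0.05):
    """Denoising score matching at one small noise level (illustrative).

    Perturb clean data x with Gaussian noise of scale sigma and regress
    the model's score at the perturbed point onto the score of the
    Gaussian perturbation kernel, which equals -(x_tilde - x) / sigma**2.
    """
    eps = torch.randn_like(x)        # standard Gaussian noise
    x_tilde = x + sigma * eps        # slightly perturbed sample
    target = -eps / sigma            # = -(x_tilde - x) / sigma**2
    score = score_net(x_tilde)       # model's estimate of the score
    return 0.5 * ((score - target) ** 2).sum(dim=-1).mean()
```

Up to an additive constant, this loss equals the Fisher divergence between the model and the sigma-perturbed data distribution (Vincent, 2011), which matches the abstract's framing of estimating only the score of the slightly perturbed data.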
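With a single score field over (near-)clean data, conditioning on any subset of observed coordinates becomes straightforward: the restriction of the full score to the unobserved coordinates is exactly the score of the conditional distribution, since the observed coordinates contribute no gradient. Below is a minimal sketch of unadjusted Langevin dynamics with clamped observations; the function name, mask convention, step size, and step count are illustrative assumptions, not the paper's procedure.

```python
import torch

@torch.no_grad()
def conditional_langevin(score_net, x_init, obs_mask, x_obs,
                         n_steps=1000, step_size=1e-4):
    """Sample from p(x_free | x_obs) via Langevin dynamics (illustrative).

    obs_mask is a boolean tensor marking observed coordinates; x_obs
    holds their values. Observed coordinates are clamped throughout,
    so the dynamics follow the conditional score on the free ones.
    """
    x = x_init.clone()
    x[obs_mask] = x_obs              # clamp observed coordinates
    for _ in range(n_steps):
        grad = score_net(x)          # approximate score of the full vector
        noise = torch.randn_like(x)
        x = x + 0.5 * step_size * grad + step_size ** 0.5 * noise
        x[obs_mask] = x_obs          # re-clamp after each noisy step
    return x
```

The same loop covers arbitrary conditioning patterns (e.g., inpainting-style queries over any coordinate subset) simply by changing `obs_mask`, which is the kind of flexible probabilistic reasoning the abstract highlights.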
Submission Number: 21