Keywords: discrete generative models, discrete flow matching, discrete diffusion, few-step distillation
TL;DR: Rectified Flow for Discrete Flow-based Models
Abstract: Discrete Flow-based Models (DFMs) are powerful generative models for high-quality discrete data but typically suffer from slow sampling speeds due to their reliance on iterative decoding processes.
This reliance on a multi-step process originates from the factorization approximation of DFMs, which is necessary for handling high-dimensional data.
In this paper, we analyze the factorization approximation error using Conditional Total Correlation (TC), and reveal its dependence on the coupling.
To address the challenge of efficient few-step generation, we propose Rectified Discrete Flow (ReDi), a novel iterative method that reduces the underlying factorization error (measured as Conditional TC) by rectifying the coupling between source and target distributions.
We theoretically prove that each ReDi step guarantees a monotonic decrease in Conditional TC, ensuring convergence.
Empirically, ReDi significantly reduces Conditional TC and enables few-step generation.
Moreover, we demonstrate that the rectified couplings are well-suited for training efficient one-step models on image generation.
ReDi offers a simple and theoretically grounded approach for tackling the few-step challenge, providing a new perspective on efficient discrete data synthesis.
Code is available at https://github.com/Ugness/ReDi_discrete.
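The rectification loop described in the abstract follows the same pattern as Rectified Flow's "reflow" procedure: generate targets from the current model for fixed source samples to form a deterministic coupling, then retrain on that coupling. A minimal sketch of this loop, with placeholder names (`sample_from_model`, `train_dfm`) that are assumptions for illustration and not the authors' actual API:

```python
# Hypothetical sketch of an iterative coupling-rectification loop,
# by analogy with Rectified Flow's reflow. The callables below
# (sample_from_model, train_dfm) are placeholders, not ReDi's real API.

def redi_rectify(source_samples, sample_from_model, train_dfm, n_rounds=2):
    """Iteratively rectify the source-target coupling.

    Each round: (1) pair every fixed source sample with a target drawn
    from the current model, forming a deterministic coupling;
    (2) retrain the model on that coupling. Per the paper, each such
    step monotonically reduces the Conditional TC of the coupling.
    """
    # Round 1: build the initial coupling from the base sampler.
    coupling = [(x0, sample_from_model(x0)) for x0 in source_samples]
    model = train_dfm(coupling)
    # Subsequent rounds: regenerate targets with the retrained model.
    for _ in range(n_rounds - 1):
        coupling = [(x0, model(x0)) for x0, _ in coupling]
        model = train_dfm(coupling)
    return model, coupling
```

With toy stand-ins (a shift sampler and a lookup-table "trainer"), the loop reproduces a fixed coupling across rounds, which is the behavior the convergence guarantee targets.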
Primary Area: Deep learning (e.g., architectures, generative models, optimization for deep networks, foundation models, LLMs)
Submission Number: 9307