Sequential Flow Straightening for Generative Modeling

19 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: generative model; flow matching
TL;DR: Sequential reflowing generates straightened flows and achieves better generation quality with low NFE.
Abstract: Although continuous-time generative models that simulate ODEs and SDEs, such as diffusion models and flow-based models, have achieved great success in tasks such as large-scale image synthesis, generating high-quality samples from these models requires a large number of function evaluations (NFE) of the neural network. One key reason for the slow sampling speed of the ODE solvers that simulate these generative models is the high curvature of the ODE trajectory, which inflates the truncation error of the numerical solvers in the low-NFE regime. Since straightening the probability flow is key to fast sampling, as it increases the error tolerance of the numerical solver, existing methods generate a joint distribution between the noise and data distributions by fully simulating the pretrained flow and then learn a linear path between the resulting data pairs. However, generating the pairs through full simulation itself incurs a high truncation error, which degrades sampling quality. To address this challenge, we propose sequential reflow, a learning technique that straightens the flow while reducing the global truncation error, thereby enabling acceleration and improving synthesis quality. We first establish the straightening property of sequential reflow both theoretically and empirically. Via sequential reflow, we achieve an FID of 3.97 with 8 function evaluations on the CIFAR-10 dataset.
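For context, below is a minimal sketch of the standard reflow objective that the abstract builds on (in the style of Rectified Flow): noise-data pairs are produced by fully simulating a pretrained flow, and a new velocity field is regressed onto the constant straight-line velocity between each pair. The names `v_theta`, `v_prev`, and `ode_solver` are hypothetical placeholders; this illustrates the baseline reflow step, not the paper's sequential variant, whose details are not given here.

```python
# Sketch of one reflow training step (Rectified Flow style), assuming a
# trainable velocity network v_theta(x, t), a frozen pretrained flow v_prev,
# and an ODE solver that integrates v_prev from noise to data.
import torch

def reflow_step(v_theta, v_prev, noise_batch, ode_solver, optimizer):
    """One reflow update: pair each noise sample with the pretrained flow's
    output, then regress v_theta onto the linear (straight-line) velocity."""
    with torch.no_grad():
        # Data-side endpoints come from simulating the previous flow's ODE;
        # this full simulation is the source of truncation error the paper
        # argues degrades the learned pairs.
        x1 = ode_solver(v_prev, noise_batch)
    x0 = noise_batch                               # noise-side endpoints
    t = torch.rand(x0.shape[0], device=x0.device)  # uniform time samples
    t_ = t.view(-1, *([1] * (x0.dim() - 1)))       # broadcastable time
    xt = (1 - t_) * x0 + t_ * x1                   # linear interpolant
    target = x1 - x0                               # straight-line velocity
    loss = ((v_theta(xt, t) - target) ** 2).mean() # flow-matching regression
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

A straight trajectory makes the velocity field nearly constant along each path, so a low-NFE Euler solver incurs little truncation error; the paper's contribution is a sequential variant of this reflow procedure that avoids paying the full-simulation error when constructing the pairs.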
Supplementary Material: zip
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1912