Explicit Flow Matching: On The Theory of Flow Matching Algorithms with Applications

15 May 2024 (modified: 06 Nov 2024) · Submitted to NeurIPS 2024 · CC BY 4.0
Keywords: Flow Matching, Deep Learning Theory, Generative modeling, Variance Reduction, Stochastic Differential Equations
TL;DR: The paper introduces Explicit Flow Matching (ExFM), an analytical method for analyzing and training flow-based generative models that reduces variance, improves training stability, and yields exact expressions for vector fields and scores.
Abstract: This paper proposes Explicit Flow Matching (ExFM), a novel method for training and analyzing flow-based generative models. ExFM leverages a theoretically grounded loss function, the ExFM loss (a tractable form of the Flow Matching (FM) loss), to demonstrably reduce variance during training, leading to faster convergence and more stable learning. Building on a theoretical analysis of this loss, we derive exact expressions for the vector field (and, in stochastic cases, the score) for several model examples (in particular, for separating multiple exponents), and in some simple cases, exact solutions for the trajectories. We also investigate simple cases of diffusion generative models by adding a stochastic term and obtain an explicit expression for the score. While the paper emphasizes the theoretical underpinnings of ExFM, it also demonstrates its effectiveness through numerical experiments on a variety of datasets, including high-dimensional ones. Compared to traditional FM methods, ExFM achieves superior performance in both learning speed and final outcomes.
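To illustrate the kind of tractable target the abstract refers to, here is a minimal sketch (not the paper's exact formulation) of an explicit marginal vector field. It assumes a standard Gaussian source x0 ~ N(0, I), the linear interpolation path x_t = (1 - t) x0 + t x1, and an empirical target distribution over a finite dataset; under these assumptions the conditional FM target (x1 - x) / (1 - t), averaged over the Gaussian posterior on x1 given x_t = x, has the closed form below.

```python
import numpy as np

def marginal_vector_field(t, x, data):
    """Explicit marginal vector field for linear-interpolation flow matching.

    Illustrative assumptions (not necessarily the paper's setting):
    source x0 ~ N(0, I), path x_t = (1 - t) * x0 + t * x1, and the target
    is the empirical measure over `data` with shape (n, d). Then
    u_t(x | x1) = (x1 - x) / (1 - t), and its conditional expectation
    given x_t = x is a softmax-weighted average over the data points.
    """
    # Log-weights of the Gaussian posterior p(x1_i | x_t = x), since
    # x_t | x1 ~ N(t * x1, (1 - t)^2 I).
    sq_dist = np.sum((x[None, :] - t * data) ** 2, axis=1)
    log_w = -sq_dist / (2.0 * (1.0 - t) ** 2)
    w = np.exp(log_w - log_w.max())  # subtract max for numerical stability
    w /= w.sum()
    # E[x1 | x_t = x] and the resulting explicit (marginal) field.
    x1_bar = w @ data
    return (x1_bar - x) / (1.0 - t)
```

Training a network against this deterministic target, instead of the random per-sample conditional target, is the variance-reduction mechanism the abstract describes: for each (t, x) the regression target is a single expectation rather than a noisy draw.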
Primary Area: Learning theory
Submission Number: 17890