Convergence guarantees of GFlowNets

NeurIPS 2025 Workshop FPI Submission 65 Authors

Published: 23 Sept 2025 · Last Modified: 25 Nov 2025 · FPI-NEURIPS2025 Poster · CC BY 4.0
Track: Main Track
Keywords: convergence, gflownet, generative flow networks, trajectory balance
TL;DR: We show theoretically that minimizing the Trajectory Balance loss, widely used in the GFlowNet literature, guarantees that the distribution induced by the GFlowNet gets closer to the target distribution.
Abstract: Although they were introduced to approximate complex distributions defined up to normalization, Generative Flow Networks (GFlowNets) only provide strong guarantees once idealized conditions are met. However, these conditions are never satisfied exactly in practice when GFlowNets are trained using gradient-based methods. In this paper, we prove that minimizing the Trajectory Balance loss, a popular GFlowNet objective, does bring the induced distribution closer to the target distribution of interest, theoretically confirming this long-standing intuition from the GFlowNet literature. We ultimately show that the KL divergence between the two distributions is upper-bounded by the quantity being minimized, and we further verify this theoretical statement on a simple sampling task.
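For context, a standard form of the Trajectory Balance loss from the GFlowNet literature (following Malkin et al., 2022) is sketched below; the paper's exact notation, and the precise constants in its KL bound, may differ from this sketch.

```latex
% Trajectory Balance loss for a complete trajectory
% tau = (s_0 -> s_1 -> ... -> s_n = x), as in Malkin et al. (2022).
% Z_theta is the learned estimate of the partition function,
% P_F and P_B are the forward and backward policies, and
% R is the unnormalized target reward over terminal states x.
\mathcal{L}_{\mathrm{TB}}(\tau; \theta) =
  \left(
    \log \frac{Z_\theta \, \prod_{t=0}^{n-1} P_F(s_{t+1} \mid s_t; \theta)}
              {R(x) \, \prod_{t=0}^{n-1} P_B(s_t \mid s_{t+1}; \theta)}
  \right)^{2}
```

The abstract's claim can then be read as: the KL divergence between the GFlowNet's terminal-state distribution and the target R/Z is upper-bounded by (a function of) the expected value of this loss, so driving the loss to zero drives the two distributions together.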
Submission Number: 65