Analyzing GFlowNets: Stability, Expressiveness, and Assessment

Published: 17 Jun 2024, Last Modified: 02 Jul 2024 · 2nd SPIGM @ ICML Poster · CC BY 4.0
Keywords: GFlowNets
Abstract: Generative Flow Networks (GFlowNets) are powerful samplers for distributions over compositional objects (e.g., graphs). In this work, we analyze GFlowNets from three fundamental perspectives: stability, expressiveness, and assessment. For stability, we analyze how fluctuations in balance conditions impact the accuracy of GFlowNets. Our theoretical results suggest that i) the effect of balance violations is heterogeneous across the state graph and ii) each node's influence on a GFlowNet's accuracy is tied to the reward associated with its descendants. We leverage these insights to propose a weighted balance loss that leads to faster training convergence. Regarding expressiveness, we consider GFlowNets for graph generation. We prove that, given a suitable state graph, GFlowNets can accurately learn any distribution supported over trees. Strikingly, however, we exhibit simple combinations of state graphs and reward functions for which GFlowNets fail, i.e., for which balance is unattainable. We propose leveraging embeddings of children's states to circumvent this limitation and thus provably increase the expressiveness of GFlowNets. Lastly, we propose a theoretically sound and computationally tractable metric for assessing GFlowNets. We experimentally show it is a better proxy for distributional correctness than popular evaluation protocols.
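The abstract does not give the form of the weighted balance loss, but the idea can be sketched from the standard detailed-balance condition F(s)·P_F(s'|s) = F(s')·P_B(s|s'). The sketch below is a hypothetical illustration, not the paper's actual objective: it weights each edge's squared log-balance violation by a per-node weight `weights[s]`, which the abstract suggests should reflect the reward mass of the node's descendants.

```python
def weighted_db_loss(edges, log_F, log_PF, log_PB, weights):
    """Hedged sketch of a weighted detailed-balance loss (names are illustrative).

    edges:   list of (s, s') directed edges of the state graph
    log_F:   dict state -> log flow F(s)
    log_PF:  dict (s, s') -> log forward policy P_F(s'|s)
    log_PB:  dict (s', s) -> log backward policy P_B(s|s')
    weights: dict state -> weight, assumed proportional to descendant reward
    """
    total = 0.0
    for s, sp in edges:
        # Detailed balance in log space: log F(s) + log P_F(s'|s)
        # should equal log F(s') + log P_B(s|s').
        violation = (log_F[s] + log_PF[(s, sp)]) - (log_F[sp] + log_PB[(sp, s)])
        # Weight the squared violation by the source node's importance.
        total += weights[s] * violation ** 2
    return total / len(edges)
```

When all edges satisfy detailed balance exactly, the loss is zero regardless of the weights; otherwise, violations at heavily weighted (high-descendant-reward) nodes dominate the gradient, matching the heterogeneity the abstract describes.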
Submission Number: 71