Keywords: Graph Generation, Discrete Flow Models, Molecule Generation, Permutation Equivariance, Flow Matching
TL;DR: We propose DeFoG, a novel and flexible discrete flow-based framework for efficient graph generation, achieving state-of-the-art performance across synthetic and molecular datasets.
Abstract: Graph generation is fundamental to diverse scientific applications, as it can reveal the underlying distribution of complex data and ultimately generate new, realistic data points.
Despite the success of diffusion models in this domain, these models face limitations in sampling efficiency and flexibility, stemming from the tight coupling between their training and sampling stages.
To address this, we propose DeFoG, a novel framework using discrete flow matching for graph generation. DeFoG employs a flow-based approach that features an efficient linear interpolation noising process and a flexible denoising process based on a continuous-time Markov chain formulation.
We leverage an expressive graph transformer and ensure desirable node permutation properties to respect graph symmetry.
Crucially, our framework disentangles the design of the training and sampling stages, enabling more effective and efficient optimization of model performance.
We navigate this design space by introducing several algorithmic improvements that boost model performance, consistently surpassing existing diffusion models.
We also theoretically demonstrate that, for general discrete data, discrete flow models can faithfully replicate the ground truth distribution - a result that naturally extends to graph data and reinforces DeFoG's foundations.
Extensive experiments show that DeFoG achieves state-of-the-art results on synthetic and molecular datasets, improving both training and sampling efficiency over diffusion models, and excels in conditional generation on a digital pathology dataset.
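To make the "linear interpolation noising process" mentioned above concrete, here is a minimal sketch of the standard discrete flow matching interpolant, where each clean categorical label is kept with probability t and replaced by a uniform noise label otherwise. The function name `sample_noisy_state` and the use of a uniform reference distribution are illustrative assumptions, not DeFoG's exact implementation (which operates on graph node and edge labels).

```python
import random

def sample_noisy_state(x1, t, num_classes):
    """Illustrative discrete flow matching interpolant (not DeFoG's exact code):
    p_t(x_t | x_1) = t * delta(x_t, x_1) + (1 - t) * uniform.
    Each clean label is kept with probability t; otherwise a uniform
    label over {0, ..., num_classes - 1} is drawn in its place."""
    return [x if random.random() < t else random.randrange(num_classes)
            for x in x1]

clean = [2, 0, 1, 3]  # e.g., categorical node or edge labels of a small graph
print(sample_noisy_state(clean, t=1.0, num_classes=4))  # t=1 recovers the data
```

At t=1 the sample equals the data, and at t=0 it is pure noise; a denoiser trained on these interpolated states can then be sampled with a continuous-time Markov chain whose step size and rate design are chosen freely at inference time.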
Supplementary Material: zip
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1816