Keywords: Graph generation, Autoregressive model, Generative modeling
TL;DR: A novel autoregressive graph generative model based on graph filtration
Abstract: Graph generative models often face a critical trade-off between learning complex distributions and achieving fast generation speed. We introduce Autoregressive Filtration Modeling (AFM), a novel approach that addresses both challenges. AFM leverages filtration, a concept from topological data analysis, to transform graphs into short sequences of monotonically increasing subgraphs. This enables a structured autoregressive generation process, contrasting with the stochastic trajectories of diffusion models. We propose a novel autoregressive graph mixer model to learn this filtration process, coupled with a noise augmentation strategy to mitigate exposure bias and a reinforcement learning approach to refine the generative model. Extensive experiments on diverse synthetic and real-world datasets demonstrate AFM's superior performance compared to existing autoregressive models. Additionally, AFM achieves a 100-fold speedup in generation time compared to state-of-the-art diffusion models while maintaining the quality of generated graphs. This work represents a significant advancement towards high-throughput graph generation.
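The abstract's central idea, turning a graph into a short sequence of monotonically increasing (nested) subgraphs via a filtration, can be illustrated with a toy sketch. The edge-ranking rule used here (sum of endpoint degrees) and the function names are illustrative assumptions for exposition only; they are not the filtration or the model proposed in AFM.

```python
# Toy sketch of a graph filtration: rank edges with a weight function and take
# growing prefixes, producing a nested sequence G_1 <= G_2 <= ... <= G_T that
# ends with the full graph. The degree-sum weighting is a hypothetical choice.

def degree_weights(nodes, edges):
    """Assign each edge a weight: the sum of its endpoints' degrees (illustrative)."""
    deg = {v: 0 for v in nodes}
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return {e: deg[e[0]] + deg[e[1]] for e in edges}

def filtration(nodes, edges, num_steps=4):
    """Return a short sequence of edge sets, each contained in the next."""
    weights = degree_weights(nodes, edges)
    ranked = sorted(edges, key=lambda e: weights[e])
    step = max(1, len(ranked) // num_steps)
    sequence = []
    for t in range(1, num_steps + 1):
        cutoff = len(ranked) if t == num_steps else t * step
        sequence.append(set(ranked[:cutoff]))  # each prefix is a subgraph of the next
    return sequence

if __name__ == "__main__":
    nodes = [0, 1, 2, 3, 4]
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (0, 2)]
    for t, subgraph in enumerate(filtration(nodes, edges), start=1):
        print(f"G_{t}: {sorted(subgraph)}")
```

An autoregressive model in this setting would be trained to predict each subgraph in the sequence from its predecessors, so that sampling the short sequence forward reconstructs a full graph; this is the general structure the abstract describes, not a specification of AFM's architecture.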
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 11227