EDGE++: Improved Training and Sampling of EDGE

Published: 28 Oct 2023 · Last Modified: 21 Dec 2023 · NeurIPS 2023 GLFrontiers Workshop Poster
Keywords: diffusion model, graph generation
TL;DR: We introduce enhancements to the EDGE graph-generative model, optimizing computational efficiency and generative accuracy for large-scale graphs.
Abstract: Traditional graph-generative models like the Stochastic Block Model (SBM) fall short in capturing the complex structures inherent in large graphs. More recent deep learning models like NetGAN, CELL, and Variational Graph Autoencoders have made progress but still struggle to replicate key graph statistics. Diffusion-based methods such as EDGE have emerged as promising alternatives; however, they present challenges in computational efficiency and generative performance. In this paper, we propose enhancements to the EDGE model that address these issues. Specifically, we introduce a degree-specific noise schedule that optimizes the number of active nodes at each timestep, significantly reducing memory consumption. Additionally, we present an improved sampling scheme that fine-tunes the generative process, allowing finer control over the similarity between the synthesized and the true network. Our experimental results demonstrate that the proposed modifications not only improve efficiency but also enhance the accuracy of the generated graphs, offering a robust and scalable solution for graph-generation tasks.
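The abstract does not specify how a degree-specific noise schedule is parameterized. As a rough, hypothetical illustration only (the function name, the linear beta range, and the degree-weighting rule below are all assumptions, not the paper's method), one could imagine assigning each node a per-timestep noise rate scaled by its relative degree, so that the expected number of active nodes at any step stays small:

```python
import numpy as np

def degree_specific_schedule(degrees, num_steps, beta_min=1e-4, beta_max=2e-2):
    """Hypothetical per-node noise schedule: scale a shared linear
    beta schedule by each node's relative degree, so high-degree
    (hub) nodes receive proportionally higher noise rates while
    low-degree nodes are mostly inactive at any given timestep."""
    degrees = np.asarray(degrees, dtype=float)
    weights = degrees / degrees.max()                  # relative degree in (0, 1]
    base = np.linspace(beta_min, beta_max, num_steps)  # shared schedule, shape (T,)
    # Outer product yields a (T, N) matrix: one noise rate per timestep per node.
    return np.outer(base, weights)

# Example: a 4-node graph with one hub of degree 10.
schedule = degree_specific_schedule(degrees=[10, 3, 2, 1], num_steps=5)
print(schedule.shape)  # (5, 4)
```

This sketch only conveys the general idea of tying noise levels to node degree; the actual schedule used by EDGE++ would need to be taken from the paper itself.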
Submission Number: 5