Efficient Spectral Graph Diffusion based on Symmetric Normalized Laplacian

ICLR 2026 Conference Submission 25510 Authors

20 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Efficient Graph Generation, Spectral Diffusion, Eigenvalue Normalization
Abstract: Graph distribution learning and generation are fundamental challenges with applications in drug discovery, materials science, and network analysis. While diffusion-based approaches have shown promise, existing spectral methods suffer from eigenvalue imbalance and limited scalability. We introduce Efficient Spectral Graph Diffusion (ESGD), which advances spectral graph generation in three key ways: (1) compressing eigenvalues of the Symmetric Normalized Laplacian (SNL) into a bounded domain to eliminate spectrum imbalance with theoretical convergence guarantees; (2) designing a degree-matrix recovery algorithm to reconstruct adjacency matrices from SNL representations; (3) scaling to graphs with thousands of nodes where other models fail. The SNL transformation reduces condition numbers and learning difficulty for complex distribution patterns. Empirically, ESGD achieves state-of-the-art performance on generic graphs and competitive results on molecular generation, while successfully extending to large graphs. ESGD converges in 20 epochs (vs. >2000 for baselines) with 6-10× fewer sampling steps, establishing an efficient foundation for spectral graph diffusion.
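The abstract's two core operations, normalizing the Laplacian and inverting that normalization to recover the adjacency matrix, follow from standard spectral graph theory: the Symmetric Normalized Laplacian is defined as L_sym = I − D^{−1/2} A D^{−1/2}, its eigenvalues always lie in [0, 2], and given the degree matrix one can recover A = D^{1/2}(I − L_sym)D^{1/2}. A minimal sketch of these identities (illustrative only; the function names are ours, and ESGD's actual eigenvalue compression and degree-recovery algorithms are not reproduced here):

```python
import numpy as np

def sym_normalized_laplacian(A):
    """L_sym = I - D^{-1/2} A D^{-1/2}; assumes no isolated nodes.
    Its eigenvalues lie in the bounded interval [0, 2]."""
    d_inv_sqrt = A.sum(axis=1) ** -0.5
    return np.eye(len(A)) - d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]

def recover_adjacency(L, deg):
    """Invert the normalization given node degrees:
    A = D^{1/2} (I - L_sym) D^{1/2}."""
    d_sqrt = np.sqrt(deg)
    return d_sqrt[:, None] * (np.eye(len(L)) - L) * d_sqrt[None, :]

# Path graph on 4 nodes
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = sym_normalized_laplacian(A)
eig = np.linalg.eigvalsh(L)       # all eigenvalues fall in [0, 2]
A_rec = recover_adjacency(L, A.sum(axis=1))  # matches A exactly
```

The bounded spectrum is what makes the SNL a convenient target for diffusion; ESGD's contribution is compressing and rebalancing these eigenvalues further, which this sketch does not attempt.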
Supplementary Material: zip
Primary Area: generative models
Submission Number: 25510