Efficient Spectral Graph Diffusion based on Symmetric Normalized Laplacian

ICLR 2026 Conference Submission 25510 Authors

20 Sept 2025 (modified: 23 Dec 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Efficient Graph Generation, Spectral Diffusion, Eigenvalue Normalization
Abstract: Graph generative modeling has seen rapid progress, yet existing approaches often trade off among fidelity, scalability, and stability. Continuous and discrete diffusion models capture complementary aspects but remain hampered by either structural distortion or heavy computational cost. We introduce Efficient Spectral Graph Diffusion (ESGD), a lightweight framework that performs diffusion in the compressed eigenvalue space of the Symmetric Normalized Laplacian (SNL). This spectral compression guarantees bounded eigenvalues, provable stability, and faster convergence while eliminating hub-node dominance. A novel degree-matrix recovery algorithm enables exact graph reconstruction from the spectral representation. ESGD achieves state-of-the-art generation quality with one of the smallest parameter counts, converging up to 100× faster in training and requiring 6–10× fewer sampling steps with up to 2000× lower computational cost. Our findings suggest that progress in graph generation may come less from heavier engineering, and more from principled reformulations that unlock both efficiency and fidelity.
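The abstract relies on a standard spectral fact: the Symmetric Normalized Laplacian L_sym = I − D^{−1/2} A D^{−1/2} always has eigenvalues in the bounded interval [0, 2], which is what makes diffusion in its eigenvalue space well-conditioned. The sketch below (NumPy, not the authors' implementation) simply constructs L_sym for a small graph and checks that bound; the function name and example graph are illustrative assumptions.

```python
# Minimal sketch, assuming NumPy: build the Symmetric Normalized Laplacian
# L_sym = I - D^{-1/2} A D^{-1/2} and verify its eigenvalues lie in [0, 2].
import numpy as np

def sym_normalized_laplacian(adj: np.ndarray) -> np.ndarray:
    """Return L_sym for a simple undirected graph given its adjacency matrix."""
    deg = adj.sum(axis=1)
    inv_sqrt_deg = np.zeros_like(deg)
    mask = deg > 0                      # guard isolated nodes (degree 0)
    inv_sqrt_deg[mask] = deg[mask] ** -0.5
    d_inv_sqrt = np.diag(inv_sqrt_deg)
    return np.eye(adj.shape[0]) - d_inv_sqrt @ adj @ d_inv_sqrt

# Example: a 4-node path graph.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
eigvals = np.linalg.eigvalsh(sym_normalized_laplacian(A))
assert eigvals.min() >= -1e-9 and eigvals.max() <= 2 + 1e-9  # bounded spectrum
print(eigvals)
```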
Supplementary Material: zip
Primary Area: generative models
Submission Number: 25510