SwinGNN: Rethinking Permutation Invariance in Diffusion Models for Graph Generation

Published: 19 Jun 2024; Last Modified: 19 Jun 2024; Accepted by TMLR
Abstract: Permutation-invariant graph diffusion models achieve invariant sampling and invariant loss functions by restricting architecture designs, which often sacrifices empirical performance. In this work, we first show that the performance degradation may also be attributed to the increased number of modes in the target distribution induced by invariant architectures, since 1) the optimal one-step denoising scores are the score functions of Gaussian mixture models (GMMs) whose components center on these modes, and 2) learning the scores of GMMs with more components is generally harder. Motivated by this analysis, we propose SwinGNN along with a simple yet provable trick that enables permutation-invariant sampling. SwinGNN benefits from more flexible (non-invariant) architecture designs while retaining permutation-invariant sampling. We further design an efficient 2-WL message passing network using shifted-window self-attention. Extensive experiments on synthetic and real-world protein and molecule datasets show that SwinGNN outperforms existing methods by a substantial margin on most metrics. Our code is released at https://github.com/qiyan98/SwinGNN.
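The "simple yet provable trick" for permutation-invariant sampling can be illustrated with a short sketch. The idea (stated here under the assumption that the model outputs a graph as an adjacency matrix; `permute_adjacency` is a hypothetical helper, not the authors' code): sample a graph from any (possibly non-invariant) generative model, then relabel its nodes with a uniformly random permutation. The resulting sampling distribution is invariant to node ordering by construction, regardless of the model architecture.

```python
import numpy as np

def permute_adjacency(adj, rng=None):
    """Relabel nodes of a graph with a uniformly random permutation.

    If a graph is sampled from any distribution and then passed through
    this function, the distribution of the output is permutation-invariant,
    even when the underlying generative model is not.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = adj.shape[0]
    perm = rng.permutation(n)        # uniform random node relabeling
    return adj[np.ix_(perm, perm)]   # computes P A P^T via index fancy-indexing

# Usage: relabel a 3-node directed chain 0 -> 1 -> 2.
A = np.array([[0, 1, 0],
              [0, 0, 1],
              [0, 0, 0]])
A_perm = permute_adjacency(A, rng=np.random.default_rng(0))
```

Relabeling preserves the graph structure (edge count, degree sequence); only the node ordering changes, which is exactly why the trick leaves the modeled graph distribution intact up to isomorphism.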
Submission Length: Regular submission (no more than 12 pages of main content)
Code: https://github.com/qiyan98/SwinGNN
Supplementary Material: zip
Assigned Action Editor: ~Guillaume_Rabusseau1
Submission Number: 2269