SwinGNN: Rethinking Permutation Invariance in Diffusion Models for Graph Generation

TMLR Paper2269 Authors

20 Feb 2024 (modified: 01 Mar 2024) · Under review for TMLR
Abstract: Permutation-invariant diffusion models of graphs achieve invariant sampling and invariant loss functions by restricting their architecture designs, which often sacrifices empirical performance. In this work, we first show that this performance degradation may also stem from the increased number of modes in the target distribution induced by invariant architectures, since 1) the optimal one-step denoising scores are score functions of Gaussian mixture models (GMMs) whose components center on these modes, and 2) learning the scores of GMMs with more components is often harder. Motivated by this analysis, we propose SwinGNN together with a simple yet provable trick that enables permutation-invariant sampling, so the model benefits from both more flexible (non-invariant) architecture designs and permutation-invariant sampling. We further design an efficient 2-WL message passing network using shifted-window self-attention. Extensive experiments on synthetic and real-world protein and molecule datasets show that SwinGNN outperforms existing methods by a substantial margin on most metrics.
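The "simple yet provable trick" referenced in the abstract is, per the paper, a random node permutation applied to each generated sample: if a (non-invariant) model produces adjacency matrices from some distribution p, then conjugating each sample by a uniformly random permutation yields a permutation-invariant sampling distribution. A minimal sketch of this post-processing step (the function name `permute_sample` is illustrative, not from the paper):

```python
import numpy as np

def permute_sample(adj, rng=None):
    """Apply a uniformly random node permutation to a sampled
    adjacency matrix. If A ~ p(A) for any model distribution p,
    the output P A P^T follows the symmetrized, permutation-
    invariant version of p, while leaving each individual graph
    isomorphic to the original sample."""
    rng = np.random.default_rng() if rng is None else rng
    perm = rng.permutation(adj.shape[0])
    # Reindex both rows and columns with the same permutation.
    return adj[np.ix_(perm, perm)]

# Example: the output is isomorphic to the input graph, so
# graph-level statistics are unchanged.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]])
B = permute_sample(A)
assert B.sum() == A.sum()      # edge count preserved
assert np.array_equal(B, B.T)  # symmetry preserved
```

Because the permutation is drawn independently of the sample, averaging over it makes the induced distribution over adjacency matrices invariant to node relabeling, without constraining the denoising network itself.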
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Guillaume_Rabusseau1
Submission Number: 2269