SubDiff: Subgraph Latent Diffusion Model

19 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: Latent Diffusion Model, Subgraph Learning, Conditional Generative Model
Abstract: Diffusion models have achieved impressive performance on generative tasks across various domains. While numerous approaches strive to generate feature-rich graphs to advance foundational scientific research, challenges remain that hinder the generation of high-quality graphs. First, the discrete geometric nature of graphs makes it difficult for diffusion models to capture complex node-level dependencies. Second, a gap remains in unifying unconditional and conditional generation. In this paper, we propose a subgraph latent diffusion model that jointly addresses the above challenges by inheriting the desirable properties of subgraphs. Subgraphs adapt the diffusion process to discrete geometric data by simplifying the complex dependencies between nodes. Moreover, subgraph latent embeddings with explicit supervision can bridge the gap between unconditional and conditional generation. To this end, we propose a subgraph latent diffusion model (SubDiff) that takes subgraphs as the minimum units. Specifically, a novel Subgraph Equivariant Graph Neural Network is proposed to achieve graph equivariance. Then a Head Alterable Sampling strategy (HAS) is devised to allow different sampling routes along the diffusion process, unifying conditional and unconditional generative learning. Theoretical analysis demonstrates that our training objective is equivalent to optimizing the variational lower bound of the log-likelihood. Extensive experiments show that SubDiff achieves better performance in both generative schemes.
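The abstract states that the training objective is equivalent to optimizing a variational lower bound of the log-likelihood. As a point of reference, the sketch below shows one step of the standard simplified (noise-prediction) diffusion objective applied to a toy latent vector; this reweighted epsilon-prediction loss is the usual form such a variational bound takes. The noise schedule, latent dimension, and `toy_denoiser` are illustrative assumptions, not SubDiff's actual architecture (which the paper implements as a Subgraph Equivariant GNN over subgraph latents).

```python
import numpy as np

rng = np.random.default_rng(0)

# Linear noise schedule (hypothetical values; the paper's schedule is unspecified).
T = 100
betas = np.linspace(1e-4, 0.02, T)
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)

def forward_diffuse(z0, t, eps):
    """Closed-form forward process q(z_t | z_0) on a latent vector z0."""
    return np.sqrt(alpha_bars[t]) * z0 + np.sqrt(1.0 - alpha_bars[t]) * eps

def toy_denoiser(zt, t):
    """Stand-in for the learned noise predictor; SubDiff would use a
    subgraph-equivariant GNN here. Purely a placeholder."""
    return 0.1 * zt

# One training step of the simplified epsilon-prediction objective,
# a reweighted variational lower bound on the log-likelihood.
z0 = rng.standard_normal(8)     # toy subgraph latent embedding
t = int(rng.integers(0, T))     # random diffusion timestep
eps = rng.standard_normal(8)    # Gaussian noise
zt = forward_diffuse(z0, t, eps)
loss = float(np.mean((eps - toy_denoiser(zt, t)) ** 2))
print(loss)
```

In a real training loop, the loss would be backpropagated through the denoiser's parameters; here the denoiser is fixed, so the snippet only illustrates the objective's shape.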
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 1946