Diffusion models have achieved impressive performance on generative tasks across various domains. While numerous approaches strive to generate feature-rich graphs to advance foundational scientific research, several challenges still hinder the generation of high-quality graphs. First, the discrete geometric nature of graphs makes it difficult for diffusion models to capture complex node-level dependencies. Second, a gap remains in unifying unconditional and conditional generation within a single framework. In this paper, we propose SubDiff, a subgraph latent diffusion model that jointly addresses both challenges by exploiting the favorable properties of subgraphs, which serve as the minimum units of generation. Subgraphs adapt the diffusion process to discrete geometric data by simplifying the complex dependencies between nodes. Moreover, subgraph latent embeddings with explicit supervision bridge the gap between unconditional and conditional generation. Specifically, we propose a novel Subgraph Equivariant Graph Neural Network to achieve graph equivariance, and devise a Head Alterable Sampling strategy (HAS) that allows different sampling routes along the diffusion process, unifying conditional and unconditional generative learning. Theoretical analysis demonstrates that our training objective is equivalent to optimizing the variational lower bound of the log-likelihood. Extensive experiments show that SubDiff achieves better performance under both generative schemes.