A Generative Framework for Exchangeable Graphs with Global and Local Latent Structure

Published: 23 Sept 2025 · Last Modified: 21 Oct 2025 · NPGML Poster · CC BY-NC 4.0
Keywords: VGAE, Set Transformer, Diffusion, Latent space models, LFR Benchmark, Graph Generative models, Exchangeable Graphs
TL;DR: A Set Transformer-based generative model for exchangeable graphs that captures local and global structure via node-level Gaussian mixtures and a latent diffusion process, accurately reproducing graph statistics on synthetic and realistic benchmarks.
Abstract: We introduce a generative framework for exchangeable graphs that combines a Set Transformer-based encoder-decoder architecture with a hierarchical latent space composed of a global variable and node-specific variables. The global latent is modeled via a diffusion process and acts as contextual input for node-level Gaussian mixtures. The decoder uses self-attention layers with global context injection to predict edge probabilities, ensuring high expressivity while preserving full permutation invariance. The architecture can operate without node features, relying only on the information in the adjacency matrix, which enables broad applicability beyond feature-rich domains. Through extensive experiments on synthetic benchmarks, including SBM, MMSBM, and the realistic LFR benchmark, we show that our approach accurately reproduces key graph statistics, especially for community-based networks. The global and local latent variables provide meaningful graph- and node-level context, while the architecture remains scalable to medium-sized dense graphs (e.g., 500-1000 nodes). Overall, our framework balances expressiveness, interpretability, and structural fidelity, offering a versatile tool for modeling complex graph data.
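To make the decoder's edge-probability step concrete, here is a minimal NumPy sketch of one way such a decoder could combine node-level latents with a global context variable while remaining permutation equivariant. The bilinear form `W`, the global-bias projection `b`, and the function names are hypothetical illustrations, not the paper's actual parametrization (which uses self-attention layers).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def decode_edges(Z, g, W, b):
    """Edge probabilities p_ij = sigmoid(z_i^T W z_j + b^T g).

    Z : (n, d) node-level latents
    g : (k,)   global latent providing shared graph-level context
    W : (d, d) bilinear weight (hypothetical stand-in for the
               attention-based decoder described in the abstract)
    b : (k,)   projection of the global latent into a scalar logit bias
    """
    S = Z @ W @ Z.T            # pairwise bilinear scores
    S = 0.5 * (S + S.T)        # symmetrize so p_ij == p_ji
    return sigmoid(S + b @ g)  # global context shifts every edge logit

# Permutation equivariance check: relabeling the nodes must relabel
# the edge probabilities consistently (exchangeability of the graph law).
rng = np.random.default_rng(0)
n, d, k = 6, 4, 3
Z = rng.normal(size=(n, d)); g = rng.normal(size=k)
W = rng.normal(size=(d, d)); b = rng.normal(size=k)
P = decode_edges(Z, g, W, b)
perm = rng.permutation(n)
P_perm = decode_edges(Z[perm], g, W, b)
assert np.allclose(P_perm, P[np.ix_(perm, perm)])
```

Because the score for each pair depends only on the two latents involved (plus the shared global latent), permuting the node order permutes rows and columns of the probability matrix identically, which is the equivariance property the paper's attention-based decoder also guarantees.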
Submission Number: 124