On Local Limits of Sparse Random Graphs: Color Convergence and the Refined Configuration Model

Published: 18 Sept 2025 · Last Modified: 29 Oct 2025 · NeurIPS 2025 poster · CC BY 4.0
Keywords: Random Graphs, Sparse Random Graphs, Graph Neural Networks
TL;DR: A novel notion of local convergence, color convergence, that characterizes the random graphs on which MPNNs can be learned, and a new random graph model that is universal with respect to color convergence.
Abstract: Local convergence has emerged as a fundamental tool for analyzing sparse random graph models. We introduce a new notion of local convergence, _color convergence_, based on the Weisfeiler–Leman algorithm. Color convergence fully characterizes the class of random graphs on which message-passing graph neural networks (MPNNs) are well-behaved in the limit. Building on this, we propose the _Refined Configuration Model_ (RCM), a random graph model that generalizes the configuration model. The RCM is universal with respect to local convergence among locally tree-like random graph models, a class that includes the Erdős–Rényi, stochastic block, and configuration models. Finally, this framework yields a complete characterization of the random trees that arise as local limits of such graphs.
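As background for the notion of color convergence, here is a minimal sketch of the 1-dimensional Weisfeiler–Leman (color refinement) procedure that the abstract refers to. The function name `wl_colors`, the adjacency-dict representation, and the relabeling scheme are illustrative assumptions, not code from the paper.

```python
def wl_colors(adj, num_rounds):
    """1-dimensional Weisfeiler-Leman (color refinement) on an undirected graph.

    adj: dict mapping each node to an iterable of its neighbors.
    Returns the color of every node after `num_rounds` refinement rounds.
    """
    # Start from a uniform initial coloring (no node features assumed).
    colors = {v: 0 for v in adj}
    for _ in range(num_rounds):
        # New color = (own color, multiset of neighbor colors).
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Relabel signatures with small integers so colors stay compact.
        relabel = {}
        new_colors = {}
        for v, sig in signatures.items():
            if sig not in relabel:
                relabel[sig] = len(relabel)
            new_colors[v] = relabel[sig]
        colors = new_colors
    return colors

# Example: on a 4-cycle every node sees the same neighborhood structure,
# so color refinement never distinguishes the nodes.
cycle = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}
print(wl_colors(cycle, num_rounds=3))  # {0: 0, 1: 0, 2: 0, 3: 0}
```

The per-node colors produced by this refinement are exactly the quantities whose distributional behavior a notion like color convergence would track in the sparse-graph limit; MPNN layers compute functions of the same neighborhood aggregations.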
Primary Area: Theory (e.g., control theory, learning theory, algorithmic game theory)
Submission Number: 22834