Graph Neural Differential Equations in the Infinite‑Node Limit: Convergence and Rates via Graphon Theory

Published: 23 Sept 2025, Last Modified: 27 Oct 2025. NPGML Poster. License: CC BY 4.0
Keywords: Neural ODEs, Graphs, Graphons, Continuous Limits, Convergence Rates
Abstract: Graph Neural Differential Equations (GNDEs) combine the structural inductive bias of Graph Neural Networks (GNNs) with the continuous-depth architecture of Neural ODEs, offering an effective framework for modeling dynamics on graphs. In this paper, we present the first rigorous convergence analysis of GNDEs with time-varying parameters in the infinite-node limit, providing theoretical insights into their \textit{size transferability}. We introduce Graphon Neural Differential Equations (Graphon-NDEs) as the infinite-node limit of GNDEs and establish their well-posedness. Leveraging tools from graphon theory and dynamical systems, we prove the \textit{trajectory-wise} convergence of GNDE solutions to Graphon-NDE solutions. Moreover, we derive explicit convergence rates for GNDEs over weighted graphs sampled from Lipschitz-continuous graphons and unweighted graphs sampled from {0,1}-valued (discontinuous) graphons. We further obtain size transferability bounds, providing theoretical justification for the practical strategy of transferring GNDE models trained on moderate-sized graphs to larger, structurally similar graphs without retraining. Numerical experiments support our theoretical findings.
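To make the setup concrete, the following is a minimal sketch, not the authors' implementation, of the objects the abstract describes: a graph sampled from a graphon and a toy GNDE with time-varying parameters integrated by forward Euler. All names (`sample_graph_from_graphon`, `gnde_euler`), the tanh dynamics, the piecewise-constant parameter schedule, and the Gaussian-kernel graphon are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_graph_from_graphon(W, n, rng):
    """Sample an unweighted n-node graph from a graphon W:
    draw latent points u_i ~ U[0,1] and connect i, j with
    probability W(u_i, u_j)."""
    u = np.sort(rng.uniform(size=n))
    P = W(u[:, None], u[None, :])
    A = (rng.uniform(size=(n, n)) < P).astype(float)
    A = np.triu(A, 1)
    return A + A.T  # symmetric adjacency, zero diagonal

def gnde_euler(A, X0, Ws, dt=0.05, steps=20):
    """Forward-Euler integration of a toy GNDE
        dX/dt = tanh((1/n) * A @ X @ W(t)),
    where W(t) is piecewise-constant in time (time-varying
    parameters, cycled from the list Ws). The 1/n scaling is the
    averaging normalization under which graphon limits are taken."""
    n = A.shape[0]
    X = X0.copy()
    for k in range(steps):
        W = Ws[k % len(Ws)]           # current parameter matrix W(t)
        X = X + dt * np.tanh((A @ X @ W) / n)
    return X

# Hypothetical smooth (Lipschitz) graphon: W(u, v) = exp(-(u - v)^2).
graphon = lambda u, v: np.exp(-(u - v) ** 2)
Ws = [rng.normal(size=(4, 4)) for _ in range(4)]  # time-varying parameters

A = sample_graph_from_graphon(graphon, 100, rng)
X0 = np.ones((100, 4))                # shared initial node features
X_out = gnde_euler(A, X0, Ws)
```

In this toy picture, the size-transferability question studied in the paper amounts to: if the same parameter schedule `Ws` is applied to graphs of different sizes sampled from the same graphon, how far apart are the resulting node trajectories?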
Submission Number: 112