Permutation Equivariant Neural Controlled Differential Equations for Dynamic Graph Representation Learning

Published: 18 Sept 2025, Last Modified: 29 Oct 2025 · NeurIPS 2025 poster · CC BY 4.0
Keywords: Temporal Graph Representation Learning, Neural Differential Equations, Equivariance Theory, Continuous Graph Neural Networks, Dynamical Systems
TL;DR: We propose Permutation Equivariant Graph Neural CDEs, an equivariant and parameter-efficient extension of Graph Neural CDEs for dynamic graph representation learning.
Abstract: Dynamic graphs exhibit complex temporal dynamics due to the interplay between evolving node features and changing network structures. Recently, Graph Neural Controlled Differential Equations (Graph Neural CDEs) successfully adapted Neural CDEs from paths on Euclidean domains to paths on graph domains. Building on this foundation, we introduce \textit{Permutation Equivariant Graph Neural CDEs}, which project Graph Neural CDEs onto permutation equivariant function spaces. This significantly reduces the model's parameter count without compromising representational power, resulting in more efficient training and improved generalisation. We empirically demonstrate the advantages of our approach through experiments on simulated dynamical systems and real-world tasks, showing improved performance in both interpolation and extrapolation scenarios.
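To make the central idea concrete, here is a minimal sketch of a permutation-equivariant vector field of the kind that could drive a Graph Neural CDE. This is an illustrative assumption, not the authors' actual architecture: the function `equivariant_vector_field` and its weight names are hypothetical. It combines three terms (per-node, neighbourhood aggregation, and global mean) that each commute with node permutations, so relabelling the nodes relabels the output identically.

```python
import numpy as np

def equivariant_vector_field(X, A, W1, W2, W3):
    """Hypothetical permutation-equivariant vector field for a Graph Neural CDE.

    Each term commutes with a node permutation P (with A mapped to P A P^T),
    so f(P X, P A P^T) = P f(X, A):
      - X @ W1: per-node (local) transform
      - A @ X @ W2: neighbourhood aggregation via the adjacency matrix
      - broadcast global mean @ W3: permutation-invariant pooling term
    """
    n = X.shape[0]
    mean = X.mean(axis=0, keepdims=True)  # invariant under node permutation
    return X @ W1 + A @ X @ W2 + np.ones((n, 1)) @ (mean @ W3)

# Check equivariance numerically on random data.
rng = np.random.default_rng(0)
n, d = 5, 3
X = rng.normal(size=(n, d))
A = rng.integers(0, 2, size=(n, n)).astype(float)
A = ((A + A.T) > 0).astype(float)  # symmetrise to get an undirected graph
W1, W2, W3 = (rng.normal(size=(d, d)) for _ in range(3))

P = np.eye(n)[rng.permutation(n)]  # random permutation matrix
out = equivariant_vector_field(X, A, W1, W2, W3)
out_perm = equivariant_vector_field(P @ X, P @ A @ P.T, W1, W2, W3)
assert np.allclose(out_perm, P @ out)  # equivariance holds
```

Because the weight matrices act only on the feature dimension, the parameter count is independent of the number of nodes; this is the kind of weight-sharing that underlies the parameter-efficiency claim in the abstract.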
Primary Area: General machine learning (supervised, unsupervised, online, active, etc.)
Submission Number: 22260