Towards Principled Design for Graph Neural Networks Through Governing Law of Dynamic Learning Behavior

16 Sept 2025 (modified: 12 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: Graph learning, PDE-informed GNNs, Oversmoothing, Graph Homophily
Abstract: Graph Neural Networks (GNNs) have been extensively evaluated in the machine learning literature. Whereas most studies focus on empirical assessment of model performance across graph datasets, we push GNN benchmarking a step further to understand the learning mechanisms that shape the characteristic behavior of each GNN model. Specifically, we introduce a comprehensive benchmark framework, `PDEGNN-Bench`, to evaluate GNNs derived from six representative governing equations, i.e., partial differential equations (PDEs): isotropic/anisotropic graph heat diffusion, non-local diffusion, reaction–diffusion, Hamiltonian dynamics, wave transport, and oscillatory synchronization. By linking each GNN instance to its corresponding governing equation, we derive new insights into design principles for GNNs by relating mechanistic interpretation to observed learning performance. To that end, we explore two fundamental questions: (1) How well does each governing equation mitigate over-smoothing in GNNs? (2) How does the degree of homophily in the graph topology influence model performance across PDE families? Taken together, our benchmark provides a systematic evaluation of leading GNN models through the lens of their underlying physical mechanisms. Through carefully designed experiments, we demonstrate that each family of governing equations exhibits distinct generalization and interpretability characteristics, offering guidance for selecting and designing suitable GNNs for new graph data.
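To make the PDE-to-GNN correspondence concrete, here is a minimal sketch of the simplest family the abstract lists, isotropic graph heat diffusion. This is our own illustration, not the paper's `PDEGNN-Bench` code: the graph heat equation dx/dt = -Lx, discretized with an explicit Euler step, yields the layer update x ← x − αLx, and iterating it exposes the over-smoothing behavior the benchmark's first question probes.

```python
import numpy as np

def heat_diffusion_step(X, A, alpha=0.1):
    """One isotropic heat-diffusion step x <- x - alpha * L x on node
    features X, given a symmetric adjacency matrix A (illustrative only)."""
    D = np.diag(A.sum(axis=1))   # degree matrix
    L = D - A                    # combinatorial graph Laplacian
    return X - alpha * (L @ X)

# Toy graph: a path on 3 nodes, one scalar feature per node.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
X = np.array([[1.0], [0.0], [-1.0]])

# Repeated diffusion drives all node features toward the graph average:
# the over-smoothing effect that deep diffusion-type GNN layers exhibit.
for _ in range(100):
    X = heat_diffusion_step(X, A)
print(float(np.ptp(X)))  # feature spread shrinks toward 0
```

The other PDE families in the benchmark (e.g., reaction–diffusion or Hamiltonian dynamics) modify this right-hand side precisely to counteract the collapse of feature spread seen here.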
Primary Area: datasets and benchmarks
Submission Number: 7977