When Do Distant Dependencies Matter? Diagnostics for Long-Range Propagation in GNNs

ICLR 2026 Conference Submission18804 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Graph Neural Networks, Rewiring, Long Range Interaction
Abstract: Graph Neural Networks (GNNs) propagate information locally through message passing. While local propagation is often sufficient for short-range tasks, performance can degrade when distant interactions are required. In this paper, we introduce a diagnostic metric that quantifies the role of long-range propagation. The metric is derived from margin-aligned sensitivities, providing an interpretable measure of the dominance of one-hop neighbors in margin-relevant influence. Using this diagnostic, we show that the need for long-range propagation is dataset- and architecture-dependent, rather than universal. We further demonstrate that this diagnostic metric is predictable from well-studied graph-theoretic measures, aligning with the assumptions of rewiring-based approaches. Finally, we show how the diagnostic can be leveraged during training: we design an additional layer that selectively incorporates sensitivity to long-range dependencies and can be applied to any standard GNN backbone. Experiments on both node- and graph-level benchmarks demonstrate consistent gains over rewiring-based methods, without altering the original graph topology.
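The abstract describes a diagnostic built from margin-aligned sensitivities that measures how dominant one-hop neighbors are in margin-relevant influence. The paper's exact construction is not given here, so the following is only a hypothetical sketch of that idea: differentiate a node's classification margin with respect to all input features, group the resulting sensitivity magnitudes by hop distance, and report the one-hop share. All names (`one_hop_dominance`, `hop_distances`) and the toy two-layer message-passing model are illustrative assumptions, not the authors' method.

```python
import torch

def hop_distances(adj, source):
    """BFS hop distances from `source` over a dense adjacency matrix."""
    n = adj.shape[0]
    dist = torch.full((n,), -1, dtype=torch.long)
    dist[source] = 0
    frontier, d = [source], 0
    while frontier:
        d += 1
        nxt = []
        for u in frontier:
            for v in torch.nonzero(adj[u]).flatten().tolist():
                if dist[v] == -1:
                    dist[v] = d
                    nxt.append(v)
        frontier = nxt
    return dist

def one_hop_dominance(adj, x, forward, target, label):
    """Illustrative diagnostic (not the paper's definition): fraction of the
    margin's input-gradient magnitude attributable to 1-hop neighbours.

    `forward(adj, x)` returns per-node logits; the margin is the true-class
    logit minus the best competing logit at node `target`.
    """
    x = x.clone().requires_grad_(True)
    logits = forward(adj, x)
    row = logits[target]
    others = torch.cat([row[:label], row[label + 1:]])
    margin = row[label] - others.max()
    margin.backward()
    sens = x.grad.abs().sum(dim=1)      # per-node sensitivity magnitude
    dist = hop_distances(adj, target)
    total = sens[dist >= 1].sum()       # influence from all other nodes
    if total == 0:
        return torch.tensor(1.0)        # no neighbour influence at all
    return sens[dist == 1].sum() / total

# Toy setup: path graph 0-1-2-3 and a 2-layer linear message-passing model.
adj = torch.zeros(4, 4)
for u, v in [(0, 1), (1, 2), (2, 3)]:
    adj[u, v] = adj[v, u] = 1.0

torch.manual_seed(0)
W1, W2 = torch.randn(3, 8), torch.randn(8, 2)

def forward(a, x):
    p = a + torch.eye(a.shape[0])       # propagation with self-loops
    h = torch.relu((p @ x) @ W1)
    return (p @ h) @ W2

x = torch.randn(4, 3)
score = one_hop_dominance(adj, x, forward, target=0, label=0)
```

A score near 1 would indicate that one-hop neighbors carry nearly all margin-relevant influence (long-range propagation unlikely to help), while a low score suggests distant nodes matter, matching the dataset- and architecture-dependence the abstract reports.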
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 18804