The Effectiveness of Curvature-Based Rewiring and the Role of Hyperparameters in GNNs Revisited

Published: 16 Nov 2024 · Last Modified: 26 Nov 2024 · LoG 2024 Oral · CC BY 4.0
Keywords: Geometric deep learning, Graph Neural Networks, Graph Rewiring, Curvature
Abstract: Message passing is the dominant paradigm in Graph Neural Networks. Its effectiveness, however, can be limited by the graph's topology: information travelling through bottlenecks is \textit{oversquashed} and lost during propagation. Recent efforts have therefore focused on rewiring techniques, which decouple the computational graph from the input graph given by the data. A prominent approach is to use discrete graph curvature measures to identify bottlenecks and rewire around them. In this work, we reevaluate the performance gains that curvature-based rewiring brings on non-synthetic datasets. We show that the edges selected during rewiring do not satisfy the theoretical criteria that identify bottlenecks, implying that they do not necessarily oversquash information. Subsequently, we demonstrate that the accuracies reported after rewiring on these datasets are outliers arising from sweeps over training and rewiring hyperparameters rather than consistent performance gains. In conclusion, our analysis nuances the effectiveness of curvature-based rewiring on real-world datasets and offers a new perspective on how to evaluate accuracy improvements in GNNs.
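To make the abstract's setting concrete, the following is a minimal, self-contained sketch of the generic curvature-based rewiring recipe it refers to (score each edge with a discrete curvature, treat the most negatively curved edge as a bottleneck candidate, and add a supporting edge around it). It uses the augmented Forman curvature on an unweighted graph purely for illustration; the function names and the specific rewiring step are assumptions of this sketch, not the procedure evaluated in the paper.

```python
# Illustrative sketch of curvature-based rewiring (assumed simplification,
# not the paper's exact method).

def forman_curvature(adj, u, v):
    # Augmented Forman curvature of an unweighted edge (u, v):
    # 4 - deg(u) - deg(v) + 3 * (#triangles containing the edge).
    # Strongly negative values indicate bottleneck-like edges.
    triangles = len(adj[u] & adj[v])
    return 4 - len(adj[u]) - len(adj[v]) + 3 * triangles

def rewire_once(edges):
    # Build adjacency sets from the edge list.
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Pick the most negatively curved edge as the bottleneck candidate.
    u, v = min(edges, key=lambda e: forman_curvature(adj, *e))
    # Add one "support" edge between a neighbour of u and a neighbour of v,
    # creating an alternative route around the bottleneck.
    candidates = [(a, b) for a in adj[u] - {v} for b in adj[v] - {u}
                  if a != b and b not in adj[a]]
    if candidates:
        edges = edges + [candidates[0]]
    return edges
```

On a "barbell" graph of two triangles joined by a single bridge, the bridge edge has curvature 4 - 3 - 3 = -2 (no triangles), the triangle edges have curvature 3, so the bridge is selected and a shortcut is added between the two triangles.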
Submission Type: Extended abstract (max 4 main pages).
Software: https://github.com/FloTori/Revisiting-Graph-Rewiring
Submission Number: 30