Track: long paper (up to 8 pages)
Keywords: Oversquashing, Effective resistance, Graph rewiring, GNN
TL;DR: Our work mitigates GNN oversquashing by rewiring graphs using effective resistance to add edges across bottlenecks, improving long-range information flow while managing oversmoothing.
Abstract: Graph Neural Networks (GNNs) struggle to capture long-range dependencies due to over-squashing, where information from exponentially growing neighborhoods must pass through a small number of structural bottlenecks. While recent rewiring methods attempt to alleviate this limitation, many rely on local criteria such as curvature, which can overlook global connectivity bottlenecks that restrict information flow.
We introduce Effective Resistance Rewiring (ERR), a simple topology correction strategy that uses effective resistance as a global signal to detect structural bottlenecks. ERR iteratively adds edges between node pairs with the largest resistance while removing edges with minimal resistance, strengthening weak communication pathways while controlling graph densification through a fixed edge budget. The procedure is parameter-free beyond the rewiring budget and relies on a single global measure aggregating all paths between node pairs.
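The rewiring step described above can be sketched as follows. This is a minimal, one-shot illustration (not the authors' implementation): it computes pairwise effective resistances from the pseudoinverse of the graph Laplacian, adds the highest-resistance non-edges, and removes the lowest-resistance edges under a fixed budget; the iterative schedule and the function names are assumptions.

```python
import numpy as np

def effective_resistance(A):
    """Pairwise effective resistances from the Laplacian pseudoinverse:
    R[i, j] = Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]."""
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)
    d = np.diag(Lp)
    return d[:, None] + d[None, :] - 2 * Lp

def err_rewire(A, budget=1):
    """One-shot sketch of resistance-guided rewiring (hypothetical variant of ERR):
    add `budget` non-edges with the largest effective resistance, then remove
    `budget` existing edges with the smallest resistance."""
    A = A.copy().astype(float)
    n = len(A)
    R = effective_resistance(A)
    iu = np.triu_indices(n, k=1)
    pairs = list(zip(iu[0], iu[1]))
    non_edges = [p for p in pairs if A[p] == 0]
    edges = [p for p in pairs if A[p] > 0]
    # Add edges across bottlenecks: largest-resistance non-edges.
    for i, j in sorted(non_edges, key=lambda p: -R[p])[:budget]:
        A[i, j] = A[j, i] = 1
    # Prune redundant connections: smallest-resistance existing edges.
    for i, j in sorted(edges, key=lambda p: R[p])[:budget]:
        A[i, j] = A[j, i] = 0
    return A

# Two triangles joined by a single bridge edge (a clear bottleneck).
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1
A2 = err_rewire(A, budget=1)
```

On this toy barbell graph, the added edge spans the two triangles (where resistance is highest), a redundant triangle edge is pruned, and the bridge is preserved, so the total edge count stays within the budget.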
Beyond evaluating predictive performance with a GCN model, we analyze how rewiring affects message propagation. By measuring cosine similarity between node embeddings across layers, we track how the relationship between initial node features and learned representations evolves during message passing, comparing graphs with and without rewiring. This analysis helps determine whether performance gains arise from improved long-range communication.
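A minimal version of this diagnostic can be sketched as below, assuming a plain symmetric-normalized propagation step (a GCN-style smoothing without learned weights); the function name and the choice to compare each layer's embeddings against the initial features are illustrative assumptions.

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity with a small epsilon to guard zero vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

def propagation_similarity(A, X, layers=4):
    """Per layer of symmetric-normalized propagation P = D^{-1/2} A D^{-1/2},
    record the mean cosine similarity between each node's current embedding
    and its initial features. Rapid convergence of these values across nodes
    is one symptom of oversmoothing."""
    D = A.sum(axis=1)
    Dinv = np.diag(1.0 / np.sqrt(np.maximum(D, 1e-12)))
    P = Dinv @ A @ Dinv
    H = X.copy()
    sims = []
    for _ in range(layers):
        H = P @ H
        sims.append(np.mean([cosine(H[i], X[i]) for i in range(len(X))]))
    return sims
```

Running this on the same graph before and after rewiring shows whether added edges change how quickly representations drift from the input features as depth grows.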
Experiments on homophilic (Cora, CiteSeer) and heterophilic (Cornell, Texas) graphs, including directed settings with DirGCN, reveal a fundamental trade-off between over-squashing and over-smoothing, the loss of representation diversity across layers. Resistance-guided rewiring improves connectivity and signal propagation but can accelerate representation mixing in deep models. Combining ERR with normalization techniques (e.g., PairNorm) stabilizes this trade-off and improves performance, particularly in heterophilic settings.
Anonymization: This submission has been anonymized for double-blind review via the removal of identifying information such as names, affiliations, and identifying URLs.
Submission Number: 120