Keywords: Graph Neural Networks, Schreier Cosets, Over-squashing, Deep Learning
TL;DR: SCGR rewires graphs using Schreier coset structures to reduce effective resistance, enabling stronger long-range information flow and mitigating over-squashing in GNNs with scalable efficiency.
Abstract: Graph Neural Networks (GNNs) provide a principled framework for learning on graph-structured data, yet their expressiveness is fundamentally limited by over-squashing: the exponential compression of information from distant nodes into fixed-size vectors. While graph rewiring methods attempt to alleviate this issue by modifying topology, existing approaches can introduce prohibitive computational bottlenecks. We propose Schreier-Coset Graph Rewiring (SCGR), a group-theoretic rewiring method that augments the input graph with a Schreier-coset graph derived from the special linear group $\mathrm{SL}(2,\mathbb{Z}_n)$. Unlike heuristic rewiring, SCGR provides $\textit{provable}$ theoretical guarantees: the auxiliary graph exhibits a spectral gap and bounded effective resistance, creating low-resistance bypasses for long-range communication. By coupling the two graphs with sufficient strength, we ensure that the effective resistance between any pair of nodes is bounded, directly mitigating over-squashing. Empirical evaluations demonstrate that SCGR reduces effective resistance by 15-40\% across benchmark datasets while maintaining competitive accuracy and lower computational overhead, making it practical for large-scale and diverse applications.
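As a rough illustration of the construction described above, the sketch below builds a Schreier-coset graph from the action of $\mathrm{SL}(2,\mathbb{Z}_n)$ on a coset space and merges its edges into an input graph. The specific choices here are assumptions, not the paper's exact construction: the coset space is taken to be the nonzero vectors of $\mathbb{Z}_n^2$, the generators are the standard $S$ and $T$ of $\mathrm{SL}(2,\mathbb{Z})$ reduced mod $n$, and the index-based coupling in `rewire` is a placeholder for the paper's coupling scheme.

```python
# Minimal sketch of a Schreier-coset graph construction for SCGR-style rewiring.
# Assumptions (not taken from the paper): the coset space is the set of nonzero
# vectors of Z_n^2, the generators are the standard S, T of SL(2, Z) reduced
# mod n, and auxiliary edges are attached to input nodes by a simple index map.
import itertools

import networkx as nx


def schreier_coset_graph(n: int) -> nx.Graph:
    """Schreier graph of SL(2, Z_n) acting on the nonzero vectors of Z_n^2."""
    # Standard generators of SL(2, Z), reduced mod n.
    gens = [((0, (-1) % n), (1, 0)),  # S = [[0, -1], [1, 0]]
            ((1, 1), (0, 1))]         # T = [[1, 1], [0, 1]]
    vertices = [(x, y) for x, y in itertools.product(range(n), repeat=2)
                if (x, y) != (0, 0)]
    g = nx.Graph()
    g.add_nodes_from(vertices)
    for (x, y) in vertices:
        for (a, b), (c, d) in gens:
            # Apply the generator matrix to the column vector (x, y) mod n.
            image = ((a * x + b * y) % n, (c * x + d * y) % n)
            if image != (x, y):
                g.add_edge((x, y), image)
    return g


def rewire(input_graph: nx.Graph, n: int) -> nx.Graph:
    """Augment an input graph with Schreier-coset edges (illustrative coupling)."""
    aux = schreier_coset_graph(n)
    nodes = list(input_graph.nodes())
    # Hypothetical coupling: map each auxiliary vertex onto an input node by
    # index and add the induced auxiliary edges as extra low-resistance links.
    idx = {v: nodes[i % len(nodes)] for i, v in enumerate(aux.nodes())}
    out = input_graph.copy()
    for u, v in aux.edges():
        if idx[u] != idx[v]:
            out.add_edge(idx[u], idx[v])
    return out


if __name__ == "__main__":
    base = nx.path_graph(64)  # a long path, a topology prone to over-squashing
    rewired = rewire(base, n=5)
    print(base.number_of_edges(), "->", rewired.number_of_edges())
```

In practice the rewired edge set would be fed to a standard message-passing GNN in place of (or alongside) the original adjacency; the choice of $n$ controls the size of the auxiliary graph and hence the density of the added bypass edges.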
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 18585