Expander Graph Propagation

Published: 24 Sept 2022, 19:02 (modified: 22 Nov 2022, 06:46) · NeurIPS 2022 GLFrontiers Workshop
Keywords: graph neural networks, graph representation learning, graph machine learning, bottlenecks, oversquashing, curvature, expander graphs, Cayley graphs, group theory
TL;DR: We alleviate the bottleneck of graph neural networks by propagating information over an expander family of graphs. We prove that this construction alleviates oversquashing, has subquadratic complexity, and requires no dedicated preprocessing.
Abstract: Deploying graph neural networks (GNNs) on whole-graph classification or regression tasks is known to be challenging: it often requires computing node features that are mindful of both local interactions in their neighbourhood and the global context of the graph structure. GNN architectures that navigate this space need to avoid pathological behaviours, such as bottlenecks and oversquashing, while ideally having linear time and space complexity requirements. In this work, we propose an elegant approach based on propagating information over expander graphs. We provide an efficient method for constructing expander graphs of a given size, and use this insight to propose the EGP model. We show that EGP is able to address all of the above concerns, while requiring minimal effort to set up, and provide evidence of its empirical utility on relevant datasets and baselines in the Open Graph Benchmark. Importantly, using expander graphs as a template for message passing necessarily gives rise to negative curvature. While this appears to be counterintuitive in light of recent related work on oversquashing, we theoretically demonstrate that negatively curved edges are likely to be required to obtain scalable message passing without bottlenecks. To the best of our knowledge, this is a previously unstudied result in the context of graph representation learning, and we believe our analysis paves the way to a novel class of scalable methods to counter oversquashing in GNNs.
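The abstract's claim of "an efficient method for constructing expander graphs of a given size" can be illustrated with the classical Cayley-graph construction over the special linear group SL(2, Z_n), whose Cayley graphs with the standard generators [[1,1],[0,1]] and [[1,0],[1,1]] form a well-known 4-regular expander family. The sketch below is an assumption-laden illustration of that general technique, not necessarily the paper's exact EGP construction; all function names here are invented for the example.

```python
def matmul_mod(a, b, n):
    """Multiply two 2x2 matrices (as nested tuples) over Z_n."""
    return (
        ((a[0][0] * b[0][0] + a[0][1] * b[1][0]) % n,
         (a[0][0] * b[0][1] + a[0][1] * b[1][1]) % n),
        ((a[1][0] * b[0][0] + a[1][1] * b[1][0]) % n,
         (a[1][0] * b[0][1] + a[1][1] * b[1][1]) % n),
    )

def cayley_graph_sl2(n):
    """Build the Cayley graph of SL(2, Z_n) by BFS from the identity.

    Generating set: the two standard generators and their inverses,
    giving a 4-regular graph. Returns (num_nodes, directed_edge_list).
    """
    gens = [
        ((1, 1), (0, 1)), ((1, n - 1), (0, 1)),  # [[1,1],[0,1]] and its inverse
        ((1, 0), (1, 1)), ((1, 0), (n - 1, 1)),  # [[1,0],[1,1]] and its inverse
    ]
    identity = ((1, 0), (0, 1))
    index = {identity: 0}       # group element -> node id
    frontier = [identity]
    edges = []
    while frontier:
        next_frontier = []
        for g in frontier:
            for s in gens:
                h = matmul_mod(g, s, n)
                if h not in index:          # first time we reach this element
                    index[h] = len(index)
                    next_frontier.append(h)
                edges.append((index[g], index[h]))
        frontier = next_frontier
    return len(index), edges
```

Because |SL(2, Z_p)| = p(p² − 1) for prime p, choosing n lets one dial the graph size (e.g. n = 3 yields 24 nodes; n = 5 yields 120), and the BFS runs in time linear in the number of edges, consistent with the subquadratic-complexity claim.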