Expander Graph Propagation

Published: 07 Nov 2022 · Last Modified: 05 May 2023 · NeurReps 2022 Oral
Keywords: graph neural networks, graph representation learning, graph machine learning, bottlenecks, oversquashing, curvature, expander graphs, Cayley graphs, group theory
TL;DR: We alleviate bottlenecks in graph neural networks by propagating information over an expander family of graphs. We prove that this construction alleviates oversquashing, has subquadratic complexity, and requires no dedicated preprocessing.
Abstract: Deploying graph neural networks (GNNs) on whole-graph classification or regression tasks is challenging, often requiring node features that are mindful of both local interactions and the graph global context. GNN architectures need to avoid pathological behaviours, such as bottlenecks and oversquashing, while ideally having linear time and space complexity requirements. In this work, we propose an elegant approach based on propagating information over expander graphs. We provide an efficient method for constructing expander graphs of a given size, and use this insight to propose the EGP model. We show that EGP is able to address all of the above concerns, while requiring minimal effort to set up, and provide evidence of its empirical utility on relevant datasets and baselines in the Open Graph Benchmark. Importantly, using expander graphs as a template for message passing necessarily gives rise to negative curvature. While this appears to be counterintuitive in light of recent related work on oversquashing, we theoretically demonstrate that negatively curved edges are likely to be required to obtain scalable message passing without bottlenecks.
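The abstract outlines, but does not spell out, the expander construction; the keywords point to Cayley graphs and group theory. Below is a minimal, hedged sketch of one standard construction of that kind: the Cayley graph of the special linear group SL(2, Z_n), a well-known expander family, together with a toy alternating-propagation loop in the spirit of EGP. The function names (`cayley_graph_sl2`, `egp_propagate`), the choice of generators, and the plain mean-aggregation step are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def cayley_graph_sl2(n):
    """Enumerate the Cayley graph of SL(2, Z_n) under the two standard
    generators [[1,1],[0,1]] and [[1,0],[1,1]] (plus their inverses),
    via BFS from the identity. Returns (num_nodes, directed edge list)."""
    gens = [np.array(g) % n for g in (
        [[1, 1], [0, 1]], [[1, 0], [1, 1]],    # generators
        [[1, -1], [0, 1]], [[1, 0], [-1, 1]],  # their inverses
    )]
    key = lambda m: tuple((m % n).flatten())
    identity = np.eye(2, dtype=int)
    index = {key(identity): 0}
    frontier, num_nodes, edges = [identity], 1, set()
    while frontier:
        nxt = []
        for g in frontier:
            for s in gens:
                h = (s @ g) % n
                k = key(h)
                if k not in index:
                    index[k] = num_nodes
                    num_nodes += 1
                    nxt.append(h)
                edges.add((index[key(g)], index[k]))
        frontier = nxt
    return num_nodes, sorted(edges)

def row_normalised(num_nodes, edges):
    """Dense row-normalised adjacency, i.e. a mean-aggregation operator."""
    a = np.zeros((num_nodes, num_nodes))
    for u, v in edges:
        a[u, v] = 1.0
    return a / np.maximum(a.sum(axis=1, keepdims=True), 1.0)

def egp_propagate(x, adj_input, adj_expander, num_layers=4):
    """Toy stand-in for EGP: alternate propagation steps over the input
    graph and the expander template. A real model would use learned GNN
    layers in place of plain averaging, and would align the expander's
    node count with the input graph's (e.g. by truncation)."""
    for t in range(num_layers):
        adj = adj_input if t % 2 == 0 else adj_expander
        x = np.tanh(adj @ x)
    return x

# |SL(2, Z_5)| = 120, and the Cayley graph is 4-regular.
num_nodes, edges = cayley_graph_sl2(5)
print(num_nodes, len(edges))  # 120 480
```

Note that |SL(2, Z_n)| grows cubically in n while the Cayley graph stays 4-regular, so the expander template contributes only O(|V|) edges, which is consistent with the subquadratic-complexity claim in the TL;DR.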