S4G: Breaking the Bottleneck on Graphs with Structured State Spaces

24 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: learning on graphs and other geometries & topologies
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: GNN, over-squashing, state-space models
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Abstract: Most GNNs are based on message passing, yet message-passing neural networks (MPNNs) have inherent limitations in capturing long-range interactions: the exponentially growing neighborhood information is compressed into fixed-size representations over multiple rounds of message passing. This over-squashing problem severely hinders the flow of information on the graph and creates a bottleneck in graph learning. The natural remedy of introducing global attention for point-to-point communication, as adopted in graph Transformers (GTs), lacks inductive biases on graph structure and relies on complex positional encodings to perform well in practice. In this paper, we observe that the sensitivity between nodes in an MPNN decays exponentially with their shortest-path distance. In contrast, a GT has constant sensitivity, which explains its loss of inductive bias. To address both issues, we introduce structured state spaces to capture the hierarchical structure of rooted trees, achieving linear sensitivity with theoretical guarantees. We further propose a novel graph convolution based on the state-space model, yielding a new paradigm that retains both the strong inductive biases of MPNNs and the long-range modeling capability of GTs. Extensive experiments on long-range and general graph benchmarks demonstrate the superiority of our approach.
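
To make the abstract's central idea concrete, here is a minimal sketch of the paradigm it describes: view each node's rooted tree as a sequence of hop-wise neighborhood aggregates and process that sequence with a linear state-space recurrence. This is an illustration only, not the authors' S4G implementation; the class and parameter names (`SSMGraphConv`, `num_hops`, `state_dim`) are hypothetical.

```python
# A minimal sketch (not the authors' S4G) of an SSM-style graph convolution:
# each node's rooted tree is unrolled into a sequence of hop-wise aggregates,
# which a linear state-space recurrence then consumes.
import torch
import torch.nn as nn


class SSMGraphConv(nn.Module):
    def __init__(self, in_dim: int, state_dim: int, num_hops: int):
        super().__init__()
        self.num_hops = num_hops
        # Linear state-space parameters: x_{k+1} = A x_k + B u_k, y = C x_K
        self.A = nn.Parameter(torch.eye(state_dim) * 0.9)
        self.B = nn.Parameter(torch.randn(state_dim, in_dim) * 0.01)
        self.C = nn.Parameter(torch.randn(in_dim, state_dim) * 0.01)

    def forward(self, h: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # h: node features (num_nodes, in_dim)
        # adj_norm: row-normalized adjacency matrix (num_nodes, num_nodes)
        state = torch.zeros(h.size(0), self.A.size(0), device=h.device)
        u = h
        for _ in range(self.num_hops):
            # u holds the k-hop aggregate, i.e. level k of each node's rooted tree.
            state = state @ self.A.T + u @ self.B.T
            u = adj_norm @ u  # advance to the next hop's aggregate
        return state @ self.C.T
```

Because the recurrence is linear in the hop index rather than compounding a contraction at every hop, a sketch like this conveys how inter-node sensitivity can decay linearly with distance instead of exponentially, while the hop ordering preserves a structural inductive bias that global attention lacks.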
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9207