TL;DR: A space- and time-efficient GNN with an infinite receptive field is proposed, outperforming state-of-the-art baselines on homophilic, heterophilic, and long-range graph benchmarks.
Abstract: The graph attention (GAT) mechanism has been instrumental in enabling nodes to aggregate information from their neighbours according to learned relevance, significantly enhancing the adaptiveness of graph neural networks across various graph representation learning tasks.
Recent research has sought to further leverage the power of GAT for tasks that require capturing long-range data dependencies.
However, the conventional approach of stacking GAT layers incurs an excessive memory footprint, high computation overhead, and over-smoothing.
To address these challenges, this study proposes Attentive Long-Short-range message passing (ALS), which integrates personalized PageRank to mitigate the over-smoothing problem in long-range message passing and leverages GAT to capture complex data dependencies.
Compared with naive $L$-step message passing, whose optimization requires $O(L)$ space, ALS employs implicit differentiation to achieve an $O(1)$ memory footprint and three acceleration techniques that reduce computation time by up to 89.51\%; illustrative sketches of both ideas are given below.
Extensive experiments validate ALS's robustness and state-of-the-art performance on homophilic graphs, heterophilic graphs, and long-range graph benchmarks against strong baselines, including recently proposed Graph Transformers and Graph Mambas.
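To make the long-short-range intuition concrete, here is a minimal sketch of a personalized-PageRank-style propagation with attention-weighted aggregation, assuming the attention coefficients are given and normalized per target node; the function name, the teleport coefficient `alpha`, and the iteration budget are illustrative assumptions rather than details taken from the paper.

```python
import torch

def attentive_ppr_propagate(x, edge_index, att, alpha=0.1, num_iters=50, tol=1e-5):
    """Hypothetical sketch of attentive, PPR-style long-range propagation.

    x          : [N, d] input node features
    edge_index : [2, E] edges, row 0 = source nodes, row 1 = target nodes
    att        : [E] attention weights, assumed normalized per target node
    """
    src, dst = edge_index[0], edge_index[1]
    z = x
    for _ in range(num_iters):
        msgs = att.unsqueeze(-1) * z[src]                   # weight each incoming message
        agg = torch.zeros_like(z).index_add_(0, dst, msgs)  # attentive neighbour aggregation
        z_next = (1.0 - alpha) * agg + alpha * x            # PPR teleport back to the input features
        if torch.norm(z_next - z) < tol:                    # stop once the fixed point is (approximately) reached
            return z_next
        z = z_next
    return z
```

In a full model the weights `att` would come from a GAT-style scoring of each edge; the teleport term `alpha * x` keeps every iterate anchored to the input features, which is what counters over-smoothing even as the effective receptive field grows with the number of iterations.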
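The $O(1)$ memory claim rests on differentiating through the fixed point of such an update rather than through every propagation step. The following is a generic implicit-differentiation sketch using a custom `torch.autograd.Function`: the forward pass iterates under `no_grad`, and the backward pass solves the adjoint equation from the implicit function theorem. This is the standard pattern for implicit GNNs, not the paper's actual implementation, and gradients to parameters inside `f` are omitted for brevity.

```python
import torch

class FixedPointImplicit(torch.autograd.Function):
    """Sketch of implicit differentiation through z* = f(z*, x).

    Forward: iterate to the fixed point without building a graph, so memory
    does not grow with the number of iterations.
    Backward: solve u = (df/dz)^T u + grad_out by fixed-point iteration and
    return (df/dx)^T u, per the implicit function theorem.
    """

    @staticmethod
    def forward(ctx, f, x, z0, num_iters):
        with torch.no_grad():
            z = z0
            for _ in range(num_iters):
                z = f(z, x)
        ctx.f, ctx.num_iters = f, num_iters
        ctx.save_for_backward(z, x)
        return z

    @staticmethod
    def backward(ctx, grad_out):
        z, x = ctx.saved_tensors
        z = z.detach().requires_grad_(True)
        x = x.detach().requires_grad_(True)
        with torch.enable_grad():
            fz = ctx.f(z, x)                   # re-evaluate f once at the fixed point
        u = grad_out
        for _ in range(ctx.num_iters):          # adjoint fixed-point iteration
            u = torch.autograd.grad(fz, z, u, retain_graph=True)[0] + grad_out
        grad_x = torch.autograd.grad(fz, x, u)[0]
        # no gradients are needed for f, z0, or num_iters in this sketch
        return None, grad_x, None, None
```

A caller would wrap one attentive PPR step (the loop body of the previous sketch) as `f(z, x)` and invoke `FixedPointImplicit.apply(f, x, torch.zeros_like(x), 50)`; since no intermediate iterates are stored, memory stays constant in the number of propagation steps.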
Primary Area: Deep Learning->Graph Neural Networks
Keywords: PageRank, implicit graph neural networks
Submission Number: 623