Representing Long-Range Context for Graph Neural Networks with Global Attention

Published: 09 Nov 2021, Last Modified: 25 Nov 2024 · NeurIPS 2021 Poster
Keywords: graph neural networks, transformers, long-range context, Open Graph Benchmark
TL;DR: We generalize the Transformer architecture to capture long-range context in graph neural networks, achieving a new state of the art on the Open Graph Benchmark.
Abstract: Graph neural networks are powerful architectures for structured datasets. However, current methods struggle to represent long-range dependencies. Scaling the depth or width of GNNs is insufficient to broaden receptive fields, as larger GNNs encounter optimization instabilities such as vanishing gradients and representation oversmoothing, while pooling-based approaches have yet to become as universally useful as in computer vision. In this work, we propose the use of Transformer-based self-attention to learn long-range pairwise relationships, with a novel “readout” mechanism to obtain a global graph embedding. Inspired by recent computer vision results that find position-invariant attention performant in learning long-range relationships, our method, which we call GraphTrans, applies a permutation-invariant Transformer module after a standard GNN module. This simple architecture leads to state-of-the-art results on several graph classification tasks, outperforming methods that explicitly encode graph structure. Our results suggest that purely learning-based approaches without graph structure may be suitable for learning high-level, long-range relationships on graphs. Code for GraphTrans is available at https://github.com/ucbrise/graphtrans.
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
Code: https://github.com/ucbrise/graphtrans
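
Below is a minimal sketch of the architecture described in the abstract: a standard GNN module followed by a permutation-invariant Transformer with a learned readout token that yields the global graph embedding. It is written in PyTorch and PyTorch Geometric; the class name `GraphTransSketch`, the choice of GCN layers, and all hyperparameters are illustrative assumptions and do not reproduce the released implementation linked above.

```python
# Hypothetical GraphTrans-style sketch: GNN encoder -> permutation-invariant
# Transformer over node embeddings -> learned <CLS>-style readout token.
import torch
import torch.nn as nn
from torch_geometric.nn import GCNConv
from torch_geometric.utils import to_dense_batch

class GraphTransSketch(nn.Module):
    def __init__(self, in_dim, hidden_dim, num_classes, nhead=4, num_layers=2):
        super().__init__()
        self.gnn1 = GCNConv(in_dim, hidden_dim)
        self.gnn2 = GCNConv(hidden_dim, hidden_dim)
        # Learned readout token, analogous to BERT's [CLS] token.
        self.cls_token = nn.Parameter(torch.randn(1, 1, hidden_dim))
        enc_layer = nn.TransformerEncoderLayer(
            d_model=hidden_dim, nhead=nhead, batch_first=True)
        self.transformer = nn.TransformerEncoder(enc_layer, num_layers=num_layers)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def forward(self, x, edge_index, batch):
        # Local message passing over the graph structure.
        h = torch.relu(self.gnn1(x, edge_index))
        h = torch.relu(self.gnn2(h, edge_index))
        # Pack per-graph node sets into a padded [B, N_max, D] tensor.
        h, mask = to_dense_batch(h, batch)  # mask is True for real nodes
        B = h.size(0)
        cls = self.cls_token.expand(B, 1, -1)
        h = torch.cat([cls, h], dim=1)
        pad_mask = torch.cat(
            [torch.ones(B, 1, dtype=torch.bool, device=mask.device), mask], dim=1)
        # Global self-attention over all nodes; no positional encoding is used,
        # so this stage is permutation-invariant across nodes.
        h = self.transformer(h, src_key_padding_mask=~pad_mask)
        # The readout token's output serves as the global graph embedding.
        return self.classifier(h[:, 0])
```

Because no positional encoding is added to the node embeddings, the Transformer stage is invariant to node ordering; graph structure enters the model only through the preceding message-passing layers, matching the design described in the abstract.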
Community Implementations: [1 code implementation on CatalyzeX](https://www.catalyzex.com/paper/representing-long-range-context-for-graph/code)