Keywords: graphs, graph neural networks, GNNs, graph transformers, attention, graph clustering, community detection
Abstract: Message-passing neural networks have recently become the most popular approach to graph machine learning tasks, but their receptive field is limited by the number of message-passing layers. Graph transformers with global attention have been proposed to increase the receptive field; however, global attention does not take the graph topology into account and thus lacks the graph-structure-based inductive biases that are typically very important for graph machine learning tasks. In this work, we propose an alternative approach: cluster attention (CLATT). We divide the graph's nodes into clusters with off-the-shelf community detection algorithms and let each node attend to all other nodes in its cluster. CLATT provides a large receptive field while retaining strong graph-structure-based inductive biases. We show that augmenting message-passing neural networks or graph transformers with CLATT significantly improves their performance on a wide range of graph datasets, including datasets from the recently introduced GraphLand benchmark, which represents real-world applications of graph machine learning.
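The core mechanism described in the abstract, attention restricted to a node's own cluster, can be sketched as masked dot-product attention. The code below is an illustrative sketch, not the authors' implementation: it assumes cluster labels are already computed by some off-the-shelf community detection algorithm (e.g. Louvain or Leiden), and uses plain NumPy with single-head, unparameterized attention for clarity.

```python
import numpy as np

def cluster_attention(X, clusters):
    """Sketch of cluster attention: each node attends only to nodes
    in its own cluster. X: (n, d) node features; clusters: (n,) integer
    cluster labels (assumed to come from a community detection step)."""
    n, d = X.shape
    scores = X @ X.T / np.sqrt(d)                 # dot-product attention scores
    same = clusters[:, None] == clusters[None, :]  # intra-cluster mask
    scores = np.where(same, scores, -np.inf)       # block cross-cluster attention
    scores -= scores.max(axis=1, keepdims=True)    # numerically stable softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X                             # cluster-wise attention output

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))
clusters = np.array([0, 0, 0, 1, 1, 1])  # hypothetical community assignment
out = cluster_attention(X, clusters)
```

Because the mask zeroes out all cross-cluster weights, each node's output depends only on features within its cluster, which is what gives this form of attention a large receptive field inside each community while still respecting the graph's cluster structure.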
Primary Area: learning on graphs and other geometries & topologies
Submission Number: 4573