Keywords: Graph Neural Networks, Subgraphs, Transformers, Graph Transformers
TL;DR: The paper introduces Subgraphormer, a model that combines Subgraph GNNs and Graph Transformers, showing significant performance improvements in preliminary experiments.
Abstract: In the realm of Graph Neural Networks (GNNs), two intriguing research directions have recently emerged: Subgraph GNNs and Graph Transformers. These approaches have distinct origins -- Subgraph GNNs aim to address the limitations of message passing, while Graph Transformers seek to build on the success of sequential transformers in language and vision tasks. In this paper, we propose a model that integrates both approaches, dubbed _Subgraphormer_, which combines the message passing and global aggregation schemes of Subgraph GNNs with attention mechanisms and positional and structural encodings, arguably the most important components of Graph Transformers. Our preliminary experimental results demonstrate significant performance improvements over both Subgraph GNNs and Graph Transformers.
Submission Number: 15
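
Below is a minimal, hypothetical sketch of the kind of architecture the abstract describes: attention applied within each subgraph (standing in for subgraph message passing) combined with attention across subgraphs (standing in for global aggregation), with a positional encoding added to the input. All names here (`SubgraphAttentionBlock`, `pos_enc`, the dense tensor layout) are illustrative assumptions, not the paper's actual implementation.

```python
import torch
import torch.nn as nn

class SubgraphAttentionBlock(nn.Module):
    """One block mixing subgraph-style aggregation with attention.

    Operates on a dense node-feature tensor of shape
    (num_subgraphs, num_nodes, dim): one subgraph per original node,
    as in node-marking Subgraph GNNs. This is a sketch under assumed
    design choices, not the paper's exact architecture.
    """

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        # Attention within each subgraph (plays the role of message
        # passing restricted to a subgraph; a real model would mask
        # attention by the subgraph's edges).
        self.intra = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        # Attention across subgraphs for each node (global aggregation).
        self.inter = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(dim)
        self.norm2 = nn.LayerNorm(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (num_subgraphs, num_nodes, dim)
        h, _ = self.intra(x, x, x)             # attend within each subgraph
        x = self.norm1(x + h)
        xt = x.transpose(0, 1)                 # (num_nodes, num_subgraphs, dim)
        h, _ = self.inter(xt, xt, xt)          # attend across subgraphs per node
        return self.norm2(xt + h).transpose(0, 1)

# Toy usage: a 5-node graph yields a bag of 5 node-marked subgraphs with
# 16-dim features; a random tensor stands in for the positional/structural
# encoding, which a real model would compute from the graph.
num_nodes, dim = 5, 16
x = torch.randn(num_nodes, num_nodes, dim)        # (subgraphs, nodes, dim)
pos_enc = torch.randn(num_nodes, num_nodes, dim)  # stand-in for PE/SE
block = SubgraphAttentionBlock(dim)
out = block(x + pos_enc)                          # shape (5, 5, 16)
```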