Topology-Informed Graph Transformer

Published: 17 Jun 2024, Last Modified: 21 Aug 2024 · ICML 2024 Workshop GRaM · CC BY 4.0
Track: Proceedings
Keywords: Graph Neural Network, Topology, Graph Classification, Transformer, Positional embedding
Abstract: Transformers, through their self-attention mechanisms, have revolutionized performance in Natural Language Processing and Vision. Recently, there has been increasing interest in integrating Transformers with Graph Neural Networks (GNNs), employing global attention mechanisms to enhance the analysis of geometric properties of graphs. A key challenge in improving graph transformers is strengthening their ability to distinguish non-isomorphic graphs, which can in turn boost their predictive performance. To address this challenge, we introduce the 'Topology-Informed Graph Transformer (TIGT)', a novel transformer that enhances both the discriminative power in detecting graph isomorphisms and the overall performance of Graph Transformers. TIGT consists of four components: (1) a topological positional embedding layer using non-isomorphic universal covers based on cyclic subgraphs of graphs to ensure unique graph representations, (2) a dual-path message-passing layer to explicitly encode topological characteristics throughout the encoder layers, (3) a global attention mechanism, and (4) a graph information layer to recalibrate channel-wise graph features for improved feature representation. TIGT outperforms previous Graph Transformers in classifying a synthetic dataset aimed at distinguishing isomorphism classes of graphs. Additionally, mathematical analysis and empirical evaluations highlight our model's competitive edge over state-of-the-art Graph Transformers across various benchmark datasets.
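To make components (2) and (3) of the abstract concrete, here is a minimal NumPy sketch of a dual-path message-passing step combined with single-head global attention. This is illustrative only, not the authors' implementation: the function names, weight shapes, ReLU nonlinearity, and the use of a cyclic-subgraph adjacency `A_cyc` as the second path are assumptions made for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dual_path_layer(X, A, A_cyc, W1, W2):
    """One dual-path message-passing step (illustrative): one path
    aggregates node features over the ordinary adjacency A, the other
    over a cyclic-subgraph adjacency A_cyc; both are row-normalized
    and the two paths are summed before a ReLU."""
    def row_norm(M):
        return M / np.maximum(M.sum(axis=1, keepdims=True), 1.0)
    return np.maximum(row_norm(A) @ X @ W1 + row_norm(A_cyc) @ X @ W2, 0.0)

def global_attention(X, Wq, Wk, Wv):
    """Single-head global self-attention over all nodes of the graph."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = softmax(Q @ K.T / np.sqrt(K.shape[1]))
    return scores @ V

# Toy example: a 4-cycle graph with 3-dimensional node features.
rng = np.random.default_rng(0)
n, d = 4, 3
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
A_cyc = A.copy()  # here the whole graph is a single cycle
X = rng.normal(size=(n, d))
W1, W2 = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

H = dual_path_layer(X, A, A_cyc, W1, W2)  # topology-aware node features
Z = global_attention(H, Wq, Wk, Wv)       # globally attended features
print(Z.shape)  # (4, 3)
```

A full encoder block would interleave such layers with the topological positional embeddings and the channel-wise graph information layer described in the abstract; those pieces are omitted here for brevity.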
Submission Number: 4