Hop-based Heterogeneous Graph Transformer

Published: 01 Jan 2024 · Last Modified: 06 Feb 2025 · ECAI 2024 · CC BY-SA 4.0
Abstract: The Graph Transformer (GT) has shown a strong ability to process graph-structured data, addressing limitations of graph neural networks such as over-smoothing and over-squashing. However, applying GTs to real-world heterogeneous graphs (HGs) with complex topology remains challenging. First, it is difficult to design a tokenizer that is compatible with heterogeneity. Second, the computational complexity of the Transformer hampers the acquisition of high-order neighbor information in HGs. In this paper, we propose a novel Hop-based Heterogeneous Graph Transformer (H2Gormer) framework, paving a promising path for HGs to benefit from the capabilities of Transformers. We propose a Heterogeneous Hop-based Token Generation module that obtains high-order information in a flexible way. Specifically, to enrich the fine-grained heterogeneous semantics of each token, we design a tailored multi-relational encoder for the hop-based neighbors. The resulting token embeddings are fed into a Hop-based Transformer to obtain node representations, which are then combined with position embeddings to produce the final encoding. Extensive experiments on four datasets demonstrate the effectiveness of H2Gormer.
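Since the abstract describes the architecture only at a high level, the following is a minimal, illustrative PyTorch sketch of the stated pipeline: per-hop tokens built by a multi-relational encoder, a Transformer over the hop-token sequence, and position embeddings combined afterwards, as the abstract states. Every name (`MultiRelationalHopEncoder`, `H2GormerSketch`), shape, and design choice below is an assumption made for illustration, not the authors' implementation.

```python
# Illustrative sketch only: module names, shapes, and the exact way relations
# and position embeddings are combined are assumptions; the paper may differ.
import torch
import torch.nn as nn


class MultiRelationalHopEncoder(nn.Module):
    """Aggregates one-hop neighbors with a separate projection per relation
    type, then sums the relation-specific messages (an assumed design)."""

    def __init__(self, dim: int, num_relations: int):
        super().__init__()
        self.rel_proj = nn.ModuleList(nn.Linear(dim, dim) for _ in range(num_relations))

    def forward(self, h: torch.Tensor, adjs: list[torch.Tensor]) -> torch.Tensor:
        # h: [N, dim] node features; adjs[r]: [N, N] normalized adjacency of relation r.
        out = torch.zeros_like(h)
        for proj, a in zip(self.rel_proj, adjs):
            out = out + a @ proj(h)  # relation-specific one-hop aggregation
        return out


class H2GormerSketch(nn.Module):
    """One token per hop (0..K) per node; tokens pass through a Transformer
    encoder, then hop position embeddings are added (the order the abstract
    states), and the hop-0 token is read out as the node representation."""

    def __init__(self, dim: int, num_relations: int, num_hops: int, num_heads: int = 4):
        super().__init__()
        self.hop_encoder = MultiRelationalHopEncoder(dim, num_relations)
        self.num_hops = num_hops
        self.pos_emb = nn.Embedding(num_hops + 1, dim)  # one embedding per hop index
        layer = nn.TransformerEncoderLayer(dim, num_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=2)

    def forward(self, x: torch.Tensor, adjs: list[torch.Tensor]) -> torch.Tensor:
        # Hop 0 is the node itself; hop k re-applies the multi-relational
        # encoder to the previous hop's representation.
        tokens, h = [x], x
        for _ in range(self.num_hops):
            h = self.hop_encoder(h, adjs)
            tokens.append(h)
        seq = torch.stack(tokens, dim=1)           # [N, K+1, dim] hop-token sequence
        out = self.transformer(seq)                # attention across hop tokens
        pos = self.pos_emb(torch.arange(self.num_hops + 1, device=x.device))
        out = out + pos                            # combine position embeddings after
        return out[:, 0]                           # read out the hop-0 token


if __name__ == "__main__":
    N, dim, R, K = 8, 16, 3, 2
    x = torch.randn(N, dim)
    adjs = [torch.rand(N, N).softmax(dim=-1) for _ in range(R)]  # toy relation graphs
    model = H2GormerSketch(dim, R, K)
    print(model(x, adjs).shape)  # torch.Size([8, 16])
```

In this sketch each hop yields one token per node, so attention over the (K+1)-token sequence mixes information across hops rather than across the whole graph, which is one plausible reading of how a hop-based tokenizer keeps the Transformer's cost manageable; the paper's actual tokenizer and readout may differ.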