HGOT: Self-supervised Heterogeneous Graph Neural Network with Optimal Transport

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
Abstract: Heterogeneous Graph Neural Networks (HGNNs) have demonstrated excellent capabilities in processing heterogeneous information networks. Self-supervised learning on heterogeneous graphs, especially the contrastive self-supervised strategy, shows great potential when no labels are available. However, this approach requires carefully designed graph augmentation strategies and the selection of positive and negative samples, and determining the exact level of similarity between sample pairs is non-trivial. To solve this problem, we propose HGOT, a novel self-supervised Heterogeneous graph neural network with Optimal Transport, which is designed to facilitate self-supervised learning on heterogeneous graphs without graph augmentation strategies. Unlike traditional contrastive self-supervised learning, HGOT employs an optimal transport mechanism to relieve the laborious sampling of positive and negative samples. Specifically, we design an aggregating view (central view) that integrates the semantic information contained in the views induced by different meta-paths (branch views). We then introduce an optimal transport plan to identify the transport relationship between the semantics of each branch view and the central view. Aligning the optimal transport plan between graphs with that between representations forces the encoder to learn node representations that better preserve the structure of the graph space and are of higher quality. Extensive experiments on four real-world datasets demonstrate that the proposed HGOT model achieves state-of-the-art performance on various downstream tasks. In particular, on the node classification task, HGOT improves accuracy by more than 6\% on average compared with state-of-the-art methods.
Lay Summary: In this paper, we propose HGOT, a novel self-supervised heterogeneous graph neural network with optimal transport. An attention mechanism is employed to obtain an aggregated view that integrates the semantic information from different meta-paths. HGOT then exploits optimal transport theory to discover the optimal transport plan between each meta-path view and the aggregated view. By aligning the transport plans between the graph space and the representation space, HGOT enforces the backbone model to learn node representations that precisely preserve the matching relationships. Extensive experiments conducted on multiple datasets demonstrate the state-of-the-art performance of the proposed HGOT.
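The abstract and lay summary center on computing an optimal transport plan between branch-view and central-view node embeddings. The paper text here does not include code, so the following is only a minimal sketch of how such a plan is commonly computed, using entropy-regularized OT (Sinkhorn iterations) on a cosine-distance cost; the embeddings, marginals, and hyperparameters below are hypothetical stand-ins, not HGOT's actual implementation.

```python
import numpy as np

def sinkhorn(cost, a, b, eps=0.1, n_iters=200):
    """Entropy-regularized optimal transport via Sinkhorn iterations.

    cost : (n, m) cost matrix between two sets of node embeddings
    a, b : marginal distributions over the rows/columns (each sums to 1)
    Returns the (n, m) transport plan P with P.sum() == 1.
    """
    K = np.exp(-cost / eps)      # Gibbs kernel of the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)        # rescale columns toward marginal b
        u = a / (K @ v)          # rescale rows toward marginal a
    return u[:, None] * K * v[None, :]

# Toy stand-ins for branch-view and central-view node embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))      # hypothetical branch-view embeddings
Y = rng.normal(size=(5, 8))      # hypothetical central-view embeddings
Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
Yn = Y / np.linalg.norm(Y, axis=1, keepdims=True)
C = 1.0 - Xn @ Yn.T              # cost = 1 - cosine similarity

a = np.full(4, 1 / 4)            # uniform marginals for illustration
b = np.full(5, 1 / 5)
P = sinkhorn(C, a, b)
print(P.sum())                   # total mass 1: a valid joint coupling
print(P.sum(axis=1))             # row marginals match a
```

In an alignment objective of the kind the abstract describes, a plan like `P` computed in the graph space would be compared against the coupling induced by the learned representations, and the mismatch penalized so that the encoder preserves the transport relationships.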
Primary Area: Deep Learning->Graph Neural Networks
Keywords: Heterogeneous graph, Self-supervised learning, Optimal transport
Submission Number: 9061