A Fused Gromov-Wasserstein Approach to Subgraph Contrastive Learning

TMLR Paper 3177 Authors

13 Aug 2024 (modified: 20 Nov 2024) · Under review for TMLR · CC BY 4.0
Abstract: Self-supervised learning has become a key method for training deep learning models when labeled data is scarce or unavailable. While graph machine learning holds great promise across various domains, designing effective pretext tasks for self-supervised graph representation learning remains challenging. Contrastive learning, a popular approach in graph self-supervised learning, leverages positive and negative pairs to compute a contrastive loss. However, current graph contrastive learning methods often fail to fully exploit structural patterns and node similarities. To address these issues, we present Fused Gromov-Wasserstein Subgraph Contrastive Learning (FOSSIL). Our method integrates node-level and subgraph-level contrastive learning, seamlessly combining a standard node-level contrastive loss with the Fused Gromov-Wasserstein distance. This combination enables the method to capture node features and graph structure jointly. Importantly, our approach works well on both homophilic and heterophilic graphs and can dynamically create views for generating positive and negative pairs. Through extensive experiments on benchmark graph datasets, we show that FOSSIL outperforms or matches current state-of-the-art methods.
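For readers unfamiliar with the metric, a minimal sketch of the standard Fused Gromov-Wasserstein distance (Vayer et al., 2019) between two attributed graphs is given below; the trade-off parameter $\alpha$, feature cost $d$, and exponent $q$ are the generic choices from that formulation, and the paper's exact instantiation may differ:

$$
\mathrm{FGW}_{q,\alpha}(\mu, \nu) \;=\; \min_{\pi \in \Pi(\mu, \nu)} \sum_{i,j,k,l} \Big[ (1-\alpha)\, d(a_i, b_j)^{q} \;+\; \alpha\, \big| C_1(i,k) - C_2(j,l) \big|^{q} \Big]\, \pi_{ij}\, \pi_{kl}
$$

Here $\mu$ and $\nu$ are probability distributions over the nodes of the two (sub)graphs, $a_i$ and $b_j$ are node features, $C_1$ and $C_2$ are intra-graph structure matrices (e.g., adjacency or shortest-path distances), and $\Pi(\mu, \nu)$ is the set of couplings with marginals $\mu$ and $\nu$. Setting $\alpha = 0$ recovers the Wasserstein distance on features alone, and $\alpha = 1$ the Gromov-Wasserstein distance on structure alone, which is why the fused form can compare subgraphs by features and structure simultaneously.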
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission:
- We added a formal definition of graph contrastive learning.
- We updated the equations in Section 4.4 to make them clearer.
- We now write $\mathcal{L}_{ot}$ as a single equation.
Assigned Action Editor: ~Moshe_Eliasof1
Submission Number: 3177