Graph prototypical contrastive learning

Published: 01 Jan 2022 · Last Modified: 14 Nov 2024 · Inf. Sci. 2022 · CC BY-SA 4.0
Abstract: Unsupervised graph representation learning methods based on contrastive learning have drawn increasing attention and achieved promising performance. However, most of these methods model only instance-level feature similarity while ignoring the underlying semantic structure of the data as a whole. In this paper, we propose a Graph Prototypical Contrastive Learning (GPCL) framework for unsupervised graph representation learning. Besides modeling instance-level feature similarity, GPCL explores the underlying semantic structure of the whole data. Specifically, we introduce an instance-prototype contrastive objective to learn representations that are invariant to intra-class variation and discriminative across classes. Meanwhile, a prototype-prototype contrastive objective is proposed to encourage clustering consistency between instances in the same cluster and their augmentations. To optimize the model, we formulate GPCL as an online Expectation–Maximization (EM) framework: we iteratively perform an E-step, which estimates the posterior probability of prototype assignments through online clustering, and an M-step, which optimizes the model parameters through graph prototypical contrastive learning. We evaluate GPCL on various graph benchmarks, and the experimental results verify the superiority of our method.
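The abstract describes an EM-style loop: an E-step that assigns instances to prototypes via online clustering, and an M-step that optimizes an instance-prototype contrastive objective. The paper's code is not shown here, so the following is only a minimal NumPy sketch of that loop under simplifying assumptions: embeddings are taken as given (the graph encoder is omitted), the E-step is a soft nearest-prototype assignment, the M-step prototype update is a posterior-weighted mean standing in for a gradient step, and the prototype-prototype objective is left out. All function names (`e_step`, `instance_prototype_loss`, `m_step_prototypes`) and the temperature `tau` are illustrative, not from the paper.

```python
import numpy as np

def normalize(x, axis=-1, eps=1e-12):
    # L2-normalize rows so dot products act as cosine similarities
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def e_step(z, prototypes, tau=0.5):
    # Posterior over prototype assignments: softmax of scaled cosine similarity.
    # z: (N, d) instance embeddings; prototypes: (K, d) cluster centers.
    logits = z @ prototypes.T / tau                # (N, K)
    logits -= logits.max(axis=1, keepdims=True)    # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def instance_prototype_loss(z, prototypes, assign, tau=0.5):
    # ProtoNCE-style objective: pull each embedding toward its assigned
    # prototype and push it away from the remaining prototypes.
    logits = z @ prototypes.T / tau
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(len(z)), assign].mean()

def m_step_prototypes(z, posterior):
    # Soft prototype update: posterior-weighted mean of the embeddings
    # (a stand-in for the gradient-based M-step described in the abstract).
    protos = posterior.T @ z / posterior.sum(axis=0)[:, None]
    return normalize(protos)

# One EM round on random stand-in embeddings
rng = np.random.default_rng(0)
z = normalize(rng.standard_normal((32, 8)))         # 32 instances, dim 8
prototypes = normalize(rng.standard_normal((4, 8))) # K = 4 prototypes
posterior = e_step(z, prototypes)                   # E-step
loss = instance_prototype_loss(z, prototypes, posterior.argmax(axis=1))
prototypes = m_step_prototypes(z, posterior)        # M-step
```

In the full method this loop would run per mini-batch, with the M-step also backpropagating the contrastive losses into the graph encoder.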