Prototypical Contrastive Learning of Unsupervised Representations

28 Sept 2020 (edited 10 Feb 2022) · ICLR 2021 Poster
  • Keywords: self-supervised learning, unsupervised learning, representation learning, contrastive learning
  • Abstract: This paper presents Prototypical Contrastive Learning (PCL), an unsupervised representation learning method that bridges contrastive learning with clustering. PCL not only learns low-level features for the task of instance discrimination, but more importantly, it implicitly encodes semantic structures of the data into the learned embedding space. Specifically, we introduce prototypes as latent variables to help find the maximum-likelihood estimation of the network parameters in an Expectation-Maximization framework. We iteratively perform an E-step, which finds the distribution of prototypes via clustering, and an M-step, which optimizes the network via contrastive learning. We propose ProtoNCE loss, a generalized version of the InfoNCE loss for contrastive learning, which encourages representations to be closer to their assigned prototypes (an illustrative ProtoNCE sketch follows this list). PCL outperforms state-of-the-art instance-wise contrastive learning methods on multiple benchmarks with substantial improvement in low-resource transfer learning. Code and pretrained models are available at https://github.com/salesforce/PCL.
  • One-sentence Summary: We propose an unsupervised representation learning method that bridges contrastive learning with clustering in an EM framework.
  • Supplementary Material: zip
  • Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
  • Code: [salesforce/PCL](https://github.com/salesforce/PCL) + [2 community implementations](https://paperswithcode.com/paper/?openreview=KmykpuSrjcq)
  • Data: [ImageNet](https://paperswithcode.com/dataset/imagenet), [Places205](https://paperswithcode.com/dataset/places205)
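
The abstract describes ProtoNCE as an InfoNCE-style instance term plus a term that pulls each embedding toward its assigned cluster prototype. Below is a minimal sketch of such a loss, assuming a single clustering granularity; the function name, argument names, and tensor shapes are illustrative assumptions, not the salesforce/PCL API, and the authors' exact formulation is in the paper and repository.

```python
# Hedged sketch of a ProtoNCE-style loss (instance InfoNCE term + prototype term),
# based only on the abstract's description. Names and shapes are assumptions,
# not the salesforce/PCL implementation.
import torch
import torch.nn.functional as F

def proto_nce_loss(query, key, prototypes, assignments, concentration, temperature=0.07):
    """
    query:         (N, D) embeddings from the query encoder.
    key:           (N, D) positive embeddings (e.g. from a momentum encoder).
    prototypes:    (K, D) cluster centroids produced by the E-step clustering.
    assignments:   (N,)   index of each sample's assigned prototype.
    concentration: (K,)   per-prototype temperature (tighter clusters -> smaller value).
    """
    query = F.normalize(query, dim=1)
    key = F.normalize(key, dim=1)
    prototypes = F.normalize(prototypes, dim=1)

    # Instance-level InfoNCE term: the matching key is the positive,
    # all other keys in the batch act as negatives.
    inst_logits = query @ key.t() / temperature                  # (N, N)
    inst_labels = torch.arange(query.size(0), device=query.device)
    inst_loss = F.cross_entropy(inst_logits, inst_labels)

    # Prototype-level term: each sample should score highest against its own
    # prototype, with the per-prototype concentration acting as the temperature.
    proto_logits = query @ prototypes.t() / concentration        # (N, K)
    proto_loss = F.cross_entropy(proto_logits, assignments)

    return inst_loss + proto_loss
```

In the EM view sketched in the abstract, the E-step would produce `prototypes`, `assignments`, and `concentration` by clustering the embeddings (e.g. with k-means), and the M-step would update the network by minimizing a loss of this form; the paper additionally uses multiple clustering granularities, which this single-granularity sketch omits.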