The Close Relationship Between Contrastive Learning and Meta-Learning

29 Sept 2021, 00:33 (edited 24 Mar 2022) · ICLR 2022 Poster
  • Keywords: meta-learning, contrastive learning, self-supervised learning
  • Abstract: Contrastive learning has recently taken off as a paradigm for learning from unlabeled data. In this paper, we discuss the close relationship between contrastive learning and meta-learning under a certain task distribution. We complement this observation by showing that established meta-learning methods, such as Prototypical Networks, achieve comparable performance to SimCLR when paired with this task distribution. This relationship can be leveraged by taking established techniques from meta-learning, such as task-based data augmentation, and showing that they benefit contrastive learning as well. These tricks also benefit state-of-the-art self-supervised learners without using negative pairs such as BYOL, which achieves 94.6% accuracy on CIFAR-10 using a self-supervised ResNet-18 feature extractor trained with our meta-learning tricks. We conclude that existing advances designed for contrastive learning or meta-learning can be exploited to benefit the other, and it is better for contrastive learning researchers to take lessons from the meta-learning literature (and vice-versa) than to reinvent the wheel.
  • One-sentence Summary: We discuss the close relationship between contrastive learning and meta-learning, and we propose a meta-learning framework for self-supervised learning (SSL) along with meta-specific methods to improve contrastive learning performance for SSL.
  • Supplementary Material: zip
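The task distribution the abstract refers to can be illustrated with a minimal sketch (not the authors' code; function names and the cosine-similarity/temperature choices are assumptions for illustration): a SimCLR batch of N images with two augmented views can be read as an N-way, 1-shot episode in which each image is its own class, one view supplies the class prototype, and the other view is the query. The Prototypical-Network cross-entropy over this episode then has the same form as the InfoNCE contrastive objective.

```python
import numpy as np

def protonet_episode_loss(support, query, temperature=0.5):
    """Prototypical-Network loss on one episode built from augmentations.

    support: (N, D) embeddings of view 1 — one per image, each image
             treated as its own class, so each embedding is a prototype.
    query:   (N, D) embeddings of view 2; query[i] belongs to class i.
    """
    # L2-normalise so dot products are cosine similarities (SimCLR-style).
    s = support / np.linalg.norm(support, axis=1, keepdims=True)
    q = query / np.linalg.norm(query, axis=1, keepdims=True)
    logits = q @ s.T / temperature  # (N, N) query-to-prototype similarities

    # Cross-entropy with label i for query i: the same form as the
    # InfoNCE objective used in contrastive learning.
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))
```

Under this reading, swapping the contrastive loss for an established meta-learner (or applying meta-learning tricks such as task-based augmentation to the episode construction) requires no change to the rest of the pipeline.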