Self-Supervision is Not All You Need: In Defense of Semi-Supervised Learning

22 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Self-Supervised Learning, Semi-Supervised Learning, Learning with Limited Labels
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: When should labels be introduced in limited-label representation learning?
Abstract: Self-supervised learning (Self-SL) and semi-supervised learning (Semi-SL) are the two dominant approaches to representation learning with limited labels. Recent advances in Self-SL have established it as a pretraining step that initializes models with strong representations for virtually every supervised learning task. This "Self-SL pretraining followed by supervised finetuning" pipeline challenges the benefits of Semi-SL frameworks. This paper studies the advantages and disadvantages of Self-SL and Semi-SL frameworks under different conditions; at its core, it tries to answer the question "When should one be favored over the other?". In particular, we explore how the choice between Self-SL and Semi-SL affects performance on in-domain, near-domain, and out-of-distribution data, robustness to image corruptions and adversarial attacks, cross-domain few-shot learning, and the ability to learn from imbalanced data. Our extensive experiments demonstrate that in-domain performance and robustness to perturbations are the two biggest strengths of Semi-SL approaches, where they outperform Self-SL methods by large margins, while they also match Self-SL techniques in the other evaluation settings.
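The key structural difference the abstract highlights is *when* labels enter training: only at the finetuning stage (Self-SL) versus from the start, jointly with unlabeled data (Semi-SL). The following toy 1-D sketch is purely illustrative and is not the paper's code; the stand-in "encoder" (centering on unlabeled statistics) and the pseudo-labeling loop are hypothetical simplifications of the two pipelines.

```python
# Hypothetical toy contrasting the two limited-label pipelines.
# Not the paper's method -- a minimal sketch of the training schedules.

def self_sl_then_finetune(unlabeled, labeled):
    # Stage 1 (self-supervised): learn a representation from unlabeled
    # data alone; here, a centering statistic stands in for an encoder.
    mean = sum(unlabeled) / len(unlabeled)
    encode = lambda x: x - mean
    # Stage 2 (supervised finetuning): labels enter only at this point.
    pos = [encode(x) for x, y in labeled if y == 1]
    neg = [encode(x) for x, y in labeled if y == 0]
    threshold = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda x: int(encode(x) > threshold)

def semi_sl(unlabeled, labeled, rounds=3):
    # Labels are used from the start; unlabeled points are folded in
    # via pseudo-labeling, one common Semi-SL strategy.
    data = list(labeled)
    for _ in range(rounds):
        pos = [x for x, y in data if y == 1]
        neg = [x for x, y in data if y == 0]
        thr = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
        data = list(labeled) + [(x, int(x > thr)) for x in unlabeled]
    pos = [x for x, y in data if y == 1]
    neg = [x for x, y in data if y == 0]
    thr = (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2
    return lambda x: int(x > thr)

unlabeled = [0.1, 0.2, 0.3, 1.8, 1.9, 2.0]
labeled = [(0.0, 0), (2.1, 1)]
f = self_sl_then_finetune(unlabeled, labeled)
g = semi_sl(unlabeled, labeled)
print(f(0.2), f(1.9), g(0.2), g(1.9))  # both pipelines separate the toy classes
```

On this trivially separable toy both schedules agree; the paper's experiments probe the regimes (in-domain accuracy, robustness, imbalance, domain shift) where the two schedules diverge.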
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 4322