No Double Descent in Self-Supervised Learning

01 Mar 2023 (modified: 29 May 2023) · Submitted to Tiny Papers @ ICLR 2023
Keywords: double descent, self-supervised learning
TL;DR: We present evidence, from autoencoder experiments and the literature, that double descent does not occur in self-supervised settings.
Abstract: Most investigations into double descent have focused on supervised models, while the few works studying self-supervised settings report a surprising absence of the phenomenon. These results suggest that double descent may not exist in self-supervised models. We show this empirically in two additional, previously unstudied settings, using a standard and a linear autoencoder. We observe that the test loss either follows the classical U-shape or decreases monotonically, rather than exhibiting a double-descent curve. We hope that further work on this question will help elucidate the theoretical underpinnings of the phenomenon.
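As a rough illustration of the kind of experiment the abstract describes, the sketch below sweeps the bottleneck width of a linear autoencoder and records the test reconstruction loss at each width; a double-descent curve would rise and then fall again past the interpolation point, whereas the paper reports a U-shape or monotone decrease. The synthetic low-rank data, widths, optimizer, and training schedule here are all assumptions for illustration, not the paper's actual setup.

```python
import torch
import torch.nn as nn

# Hypothetical setup: synthetic low-rank data stands in for the paper's
# (unspecified) datasets.
torch.manual_seed(0)
d, rank, n_train, n_test = 64, 8, 512, 512
basis = torch.randn(rank, d)
x_train = torch.randn(n_train, rank) @ basis + 0.1 * torch.randn(n_train, d)
x_test = torch.randn(n_test, rank) @ basis + 0.1 * torch.randn(n_test, d)

def test_loss_for_width(h, epochs=200, lr=1e-2):
    # Linear autoencoder: encoder and decoder with no nonlinearity.
    model = nn.Sequential(nn.Linear(d, h, bias=False),
                          nn.Linear(h, d, bias=False))
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(x_train), x_train)
        loss.backward()
        opt.step()
    with torch.no_grad():
        # Test reconstruction loss at this bottleneck width.
        return nn.functional.mse_loss(model(x_test), x_test).item()

# Sweep bottleneck widths across the under- and over-parameterized regimes
# and inspect the shape of the resulting test-loss curve.
for h in [1, 2, 4, 8, 16, 32, 64, 128]:
    print(f"width={h:4d}  test MSE={test_loss_for_width(h):.4f}")
```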