What Contrastive Learning Learns Beyond Class-wise Features?

Published: 04 Mar 2023 · Last Modified: 16 May 2023 · ME-FoMo 2023 Poster
Keywords: contrastive learning, self-supervised learning
TL;DR: We analyze in detail the features learned by contrastive learning and discuss their impact on downstream generalization and transferability.
Abstract: In recent years, contrastive learning has achieved performance comparable to supervised learning in representation learning. However, the transferability of different contrastive learning methods to downstream tasks often varies greatly. In this paper, we study the downstream generalization ability of two contrastive learning methods: SimCLR and Spectral Contrastive Learning (Spectral CL). We find that, beyond class-wise features, contrastive learning also learns two types of features, which we call shared features and subclass features, and that these play an important role in model transferability. SimCLR learns more shared and subclass features than Spectral CL, resulting in better transferability. We reveal, both theoretically and experimentally, the mechanism by which SimCLR learns more diverse features than Spectral CL. Based on this analysis, we propose a method called High-pass Spectral CL to improve the transferability and generalization of Spectral CL, which achieves better performance than both SimCLR and Spectral CL.
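For readers unfamiliar with the two baselines compared in the abstract, below is a minimal sketch of their training objectives: SimCLR's NT-Xent loss and the spectral contrastive loss of HaoChen et al. (2021). This is an illustrative reference implementation in PyTorch, not the paper's code; the function names, the `temperature` value, and the assumption that `z1` and `z2` hold row-aligned embeddings of two augmented views are ours. The proposed High-pass Spectral CL is not sketched here, as its details are specific to the paper.

```python
import torch
import torch.nn.functional as F

def simclr_loss(z1, z2, temperature=0.5):
    """NT-Xent (SimCLR) loss; (z1[i], z2[i]) are embeddings of two views of the same image."""
    z1 = F.normalize(z1, dim=1)
    z2 = F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)            # (2N, d) stacked views
    sim = z @ z.t() / temperature             # (2N, 2N) cosine similarities
    n = z1.size(0)
    sim.fill_diagonal_(float('-inf'))         # a sample is never its own candidate
    # The positive of row i is row i+n (and vice versa).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def spectral_cl_loss(z1, z2):
    """Spectral contrastive loss (HaoChen et al., 2021):
    -2 E[f(x)^T f(x+)] + E[(f(x)^T f(x'))^2] over negative pairs."""
    n = z1.size(0)
    pos = -2.0 * (z1 * z2).sum(dim=1).mean()          # attract positive pairs
    neg = (z1 @ z2.t()) ** 2                           # squared similarities of all pairs
    mask = ~torch.eye(n, dtype=torch.bool, device=z1.device)
    return pos + neg.masked_select(mask).mean()        # repel off-diagonal (negative) pairs
```

A usage sketch: feed two augmented views of a batch through the encoder and projection head to get `z1` and `z2`, then back-propagate either loss; the paper's analysis concerns which features these two objectives end up encoding.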