Learning Weakly-supervised Contrastive Representations

Published: 28 Jan 2022 (Last Modified: 22 Oct 2023), ICLR 2022 Poster
Keywords: Self-supervised Learning, Weakly Supervised Learning, Learning with Auxiliary Information, Clustering-based Representation Learning
Abstract: We argue that one form of valuable information provided by auxiliary information is its implied data clustering. For instance, treating hashtags as auxiliary information, we can hypothesize that Instagram images sharing the same hashtags are semantically more similar. With this intuition, we present a two-stage weakly-supervised contrastive learning approach. The first stage clusters data according to its auxiliary information. The second stage learns similar representations for data within the same cluster and dissimilar representations for data from different clusters. Our empirical experiments suggest three contributions. First, compared to conventional self-supervised representations, the auxiliary-information-infused representations bring the performance closer to that of supervised representations, which use direct downstream labels as supervision signals. Second, our approach performs best in most cases when compared with other baseline representation learning methods that also leverage auxiliary information. Third, we show that our approach also works well with clusters constructed without auxiliary information (i.e., unsupervised clustering), yielding a strong unsupervised representation learning approach.
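The two-stage recipe in the abstract lends itself to a compact sketch. Below is a minimal, hypothetical illustration (not the authors' released code): Stage 1 clusters samples by their auxiliary features (e.g., hashtag embeddings) with k-means, and Stage 2 treats the resulting cluster ids as weak labels in a SupCon-style contrastive loss that pulls same-cluster representations together and pushes different-cluster representations apart. The function names, the use of k-means, and the exact loss form are assumptions for illustration only.

```python
# Hypothetical sketch of the two-stage weakly-supervised contrastive recipe.
# Not the authors' implementation; cluster method and loss form are assumed.
import torch
import torch.nn.functional as F
from sklearn.cluster import KMeans


def cluster_by_auxiliary_info(aux_features: torch.Tensor, n_clusters: int) -> torch.Tensor:
    """Stage 1: cluster data by auxiliary features (expects a CPU tensor)."""
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(aux_features.numpy())
    return torch.as_tensor(km.labels_, dtype=torch.long)


def cluster_contrastive_loss(z: torch.Tensor, cluster_ids: torch.Tensor,
                             temperature: float = 0.1) -> torch.Tensor:
    """Stage 2: SupCon-style loss with cluster ids as weak labels.

    z: (N, d) representations; cluster_ids: (N,) pseudo-labels from Stage 1.
    """
    z = F.normalize(z, dim=1)
    sim = z @ z.t() / temperature                       # (N, N) pairwise similarities
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    # Positives: same-cluster pairs, excluding each anchor itself.
    pos_mask = (cluster_ids[:, None] == cluster_ids[None, :]) & ~self_mask
    sim = sim.masked_fill(self_mask, float('-inf'))     # drop self-similarity
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # Average log-likelihood of positives per anchor; skip anchors with none.
    pos_counts = pos_mask.sum(1)
    valid = pos_counts > 0
    loss = -(log_prob * pos_mask).sum(1)[valid] / pos_counts[valid]
    return loss.mean()
```

Under these assumptions, a training loop would call `cluster_by_auxiliary_info` once over the dataset, then minimize `cluster_contrastive_loss(encoder(batch), ids[batch_idx])` per batch.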
One-sentence Summary: We present a weakly-supervised contrastive learning framework that considers auxiliary information (additional sources of information accompanying the data).
Supplementary Material: zip
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2202.06670/code)