Sampling Informative Positive Pairs in Contrastive Learning

Published: 21 May 2023, Last Modified: 09 Sept 2023
SampTA 2023 Paper
Abstract: Contrastive Learning is a paradigm for learning representation functions that recover useful similarity structure in data based on samples of positive (similar) and negative (dissimilar) instances. Typically, positive instances are sampled by randomly perturbing an anchor point using some form of data augmentation, while negative instances are sampled independently from the same distribution as the anchor points. The goal is to learn a representation function that places each anchor near its positive instances and far from its negative instances. However, not all randomly sampled positive instances are equally effective in learning a representation function that captures useful structure in the data. We consider a setting where class structure in the observed data derives from analogous structure in an unobserved latent space. We propose active sampling approaches for positive instances and investigate their role in effectively learning representation functions which recover the class structure in the underlying latent space.
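The setup described in the abstract — an anchor, a positive produced by perturbing the anchor, and negatives drawn i.i.d. from the anchor distribution, scored by an InfoNCE-style objective — can be sketched as follows. This is a minimal illustration, not the paper's method: the Gaussian `augment` perturbation, the linear representation map `f`, and the temperature value are all assumptions chosen for the toy example.

```python
import numpy as np

rng = np.random.default_rng(0)

def augment(x, noise_scale=0.1):
    # Hypothetical data augmentation: a random Gaussian perturbation
    # of the anchor point (stands in for crops, jitter, etc.).
    return x + noise_scale * rng.normal(size=x.shape)

def info_nce_loss(anchor_z, positive_z, negative_zs, temperature=0.5):
    # Cosine similarity between two representation vectors.
    def cos(a, b):
        return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

    pos = cos(anchor_z, positive_z) / temperature
    negs = np.array([cos(anchor_z, n) for n in negative_zs]) / temperature
    logits = np.concatenate([[pos], negs])
    # Cross-entropy treating the positive as the correct "class":
    # pulls the positive toward the anchor, pushes negatives away.
    return -pos + np.log(np.sum(np.exp(logits)))

# Toy representation function: a fixed random linear map (a trained
# encoder would replace this in practice).
W = rng.normal(size=(8, 4))
f = lambda x: W @ x

anchor = rng.normal(size=4)
positive = augment(anchor)            # perturbed copy of the anchor
negatives = rng.normal(size=(16, 4))  # sampled i.i.d., like anchors

loss = info_nce_loss(f(anchor), f(positive), [f(n) for n in negatives])
```

An active positive-sampling scheme, as the abstract proposes, would replace the single random `augment` call with a selection step over several candidate perturbations, keeping those most informative for recovering the latent class structure.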
Submission Type: Full Paper
Supplementary Materials: pdf