Active Learning by Acquiring Contrastive Examples

11 Nov 2021 (modified: 11 Nov 2021) · OpenReview Archive Direct Upload
Abstract: Common acquisition functions for active learning use either uncertainty or diversity sampling, aiming to select difficult and diverse data points from the pool of unlabeled data, respectively. In this work, leveraging the best of both worlds, we propose an acquisition function that opts for selecting contrastive examples, i.e. data points that are similar in the model feature space and yet for which the model outputs maximally different predictive likelihoods. We compare our approach, CAL (Contrastive Active Learning), with a diverse set of acquisition functions in four natural language understanding tasks and seven datasets. Our experiments show that CAL performs consistently better than or on par with the best performing baseline across all tasks, on both in-domain and out-of-domain data. We also conduct an extensive ablation study of our method, and we further analyze all actively acquired datasets, showing that CAL achieves a better trade-off between uncertainty and diversity compared to other strategies.
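To make the acquisition criterion concrete, below is a minimal NumPy sketch of the contrastive-scoring idea as described in the abstract: score each unlabeled point by how much its predictive distribution diverges from those of its nearest neighbors in the model's feature space, then acquire the highest-scoring points. The function names (`contrastive_scores`, `acquire`), the neighborhood size `k`, the use of Euclidean distance, and the direction of the KL divergence are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    # KL(p || q) between two predictive distributions (assumed KL direction).
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def contrastive_scores(pool_feats, pool_probs, labeled_feats, labeled_probs, k=10):
    # Score each unlabeled point by the mean KL divergence between its
    # predictive distribution and those of its k nearest labeled neighbors
    # in the model feature space (higher = more "contrastive").
    scores = np.empty(len(pool_feats))
    for i, (x, p) in enumerate(zip(pool_feats, pool_probs)):
        # Euclidean distance in feature space (illustrative choice of metric).
        dists = np.linalg.norm(labeled_feats - x, axis=1)
        neighbors = np.argsort(dists)[:k]
        scores[i] = np.mean([kl_divergence(labeled_probs[j], p) for j in neighbors])
    return scores

def acquire(pool_feats, pool_probs, labeled_feats, labeled_probs, budget=100, k=10):
    # Select the `budget` pool examples with the highest contrastive score.
    scores = contrastive_scores(pool_feats, pool_probs, labeled_feats, labeled_probs, k)
    return np.argsort(-scores)[:budget]
```

High scores flag points that sit close to labeled data in feature space yet receive very different predictions, which is how a single criterion can capture both the difficulty sought by uncertainty sampling and the coverage sought by diversity sampling.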