Time- and Label-efficient Active Learning by Diversity and Uncertainty of Probabilities

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Active Learning, Deep Active Learning, Fast, Label-Efficient
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We propose FALCUN, a novel deep batch active learning method that is label- and time-efficient by exploiting diversity and uncertainty in the probability space.
Abstract: We propose FALCUN, a novel deep batch active learning method that is label- and time-efficient. Our proposed acquisition uses a natural, self-adjusting balance of uncertainty and diversity: it slowly transitions from emphasizing uncertain instances at the decision boundary to emphasizing batch diversity. In contrast, established deep active learning methods often use a fixed weighting of uncertainty and diversity. Moreover, most methods demand an intensive search through a deep neural network's high-dimensional latent embedding space. This leads to high acquisition times during which experts are idle while waiting for the next batch to label. We overcome this structural problem by operating exclusively on the low-dimensional probability space, yielding much faster acquisition times. In extensive experiments, we show FALCUN's suitability for diverse use cases, including image and tabular data. Compared to state-of-the-art methods like BADGE, CLUE, and AlfaMix, FALCUN consistently excels in quality and speed: while FALCUN is among the fastest methods, it has the highest average label efficiency.
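To make the abstract's idea concrete, here is a hypothetical sketch of a batch acquisition that works purely in probability space and shifts emphasis from uncertainty to diversity as the batch fills. This is not the paper's FALCUN algorithm (whose exact scoring is not given here); the entropy measure, L1 distance, and linear schedule are all illustrative assumptions.

```python
import numpy as np

def sketch_acquire(probs, batch_size):
    """Illustrative probability-space acquisition (NOT FALCUN itself):
    greedily select points balancing uncertainty (normalized entropy of
    the softmax output) against diversity (minimum L1 distance, in
    probability space, to the points already chosen for the batch)."""
    n, c = probs.shape
    eps = 1e-12
    # Uncertainty: predictive entropy, normalized to [0, 1]; computed once.
    entropy = -np.sum(probs * np.log(probs + eps), axis=1) / np.log(c)

    selected = []
    # Minimum L1 distance in probability space to any selected point.
    min_dist = np.full(n, np.inf)
    for t in range(batch_size):
        # Assumed schedule: weight drifts from pure uncertainty (w = 0)
        # toward pure diversity (w = 1) as the batch fills.
        w = t / max(batch_size - 1, 1)
        if selected:
            div = min_dist / 2.0  # L1 distance between distributions lies in [0, 2]
        else:
            div = np.zeros(n)
        score = (1 - w) * entropy + w * div
        score[selected] = -np.inf  # never re-pick a point
        i = int(np.argmax(score))
        selected.append(i)
        min_dist = np.minimum(min_dist, np.abs(probs - probs[i]).sum(axis=1))
    return selected
```

Because only the `n × c` probability matrix is touched, each greedy step costs O(n·c) rather than a search through a high-dimensional embedding space, which is the structural source of the speedup the abstract claims.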
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7313