Stochastic Batch Acquisition: A Simple Baseline for Deep Active Learning

Published: 19 Sept 2023. Last Modified: 19 Sept 2023. Accepted by TMLR.
Authors that are also TMLR Expert Reviewers: ~Andreas_Kirsch1
Abstract: We examine a simple stochastic strategy for adapting well-known single-point acquisition functions to batch active learning. Unlike acquiring the top-K points from the pool set, score- or rank-based sampling takes into account that acquisition scores change as new data are acquired. This simple way of adapting standard single-sample acquisition strategies can perform just as well as compute-intensive state-of-the-art batch acquisition functions, such as BatchBALD or BADGE, while using orders of magnitude less compute. In addition to providing a practical option for machine learning practitioners, the surprising success of the proposed method in a wide range of experimental settings raises a difficult question for the field: when are these expensive batch acquisition methods pulling their weight?
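The core idea from the abstract can be sketched in a few lines: instead of deterministically taking the K highest-scoring pool points, sample a batch with probabilities derived from the acquisition scores (here via a softmax). This is a minimal, hedged sketch, not the paper's exact implementation; the function name, the `temperature` parameter, and the softmax choice are illustrative assumptions.

```python
import numpy as np


def stochastic_batch_acquire(scores, batch_size, temperature=1.0, rng=None):
    """Sample a batch of pool indices with probability given by a softmax
    over acquisition scores, rather than taking the deterministic top-K.

    Note: this is an illustrative sketch of score-based stochastic
    acquisition; the original work also considers rank-based variants.
    """
    rng = np.random.default_rng() if rng is None else rng
    scores = np.asarray(scores, dtype=float)
    # Numerically stable softmax over (temperature-scaled) scores.
    logits = scores / temperature
    logits -= logits.max()
    probs = np.exp(logits)
    probs /= probs.sum()
    # Sample without replacement so each pool point is acquired at most once.
    return rng.choice(len(scores), size=batch_size, replace=False, p=probs)
```

Compared with top-K selection, this costs only a softmax and a weighted draw, which is where the "orders of magnitude less compute" claim relative to BatchBALD or BADGE comes from.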
Certifications: Expert Certification
Submission Length: Regular submission (no more than 12 pages of main content)
Code:
https://github.com/baal-org/baal
https://github.com/OATML/causal-bald
https://github.com/BlackHC/active-bayesian-coresets
https://github.com/BlackHC/active_learning_redux
Assigned Action Editor: ~Chicheng_Zhang1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 827