SABAL: Sparse Approximation-based Batch Active Learning

Published: 28 Jan 2022, Last Modified: 13 Feb 2023
Venue: ICLR 2022 Submitted
Readers: Everyone
Keywords: active learning, Bayesian active learning, batch active learning
Abstract: We propose a novel and general framework (SABAL) that formulates batch active learning as a sparse approximation problem. SABAL aims to find a weighted subset of the unlabeled data pool such that the corresponding training loss function approximates its full-data-pool counterpart. We realize the general framework as a sparsity-constrained discontinuous optimization problem that explicitly balances uncertainty and representation for large-scale applications, for which we propose both greedy and iterative hard thresholding schemes. The proposed method can adapt to various settings, including both Bayesian and non-Bayesian neural networks. Numerical experiments show that SABAL achieves state-of-the-art performance across different settings with lower computational complexity.
One-sentence Summary: We propose a novel and general framework that formulates batch active learning as a sparse approximation problem.
Supplementary Material: zip
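
The abstract mentions an iterative hard thresholding (IHT) scheme for the sparsity-constrained subset-selection problem. The paper's actual algorithm is not reproduced on this page; as a rough, generic illustration of the IHT idea, here is a minimal sketch for the standard sparse approximation problem min_w ||Aw - b||^2 subject to ||w||_0 <= k. All names here (A, b, k, the step size, the toy data) are illustrative assumptions, not taken from the paper, and the least-squares objective stands in for SABAL's loss-approximation objective.

```python
import numpy as np

def iterative_hard_thresholding(A, b, k, step=None, n_iters=100):
    """Generic IHT for min_w ||A w - b||^2  s.t.  ||w||_0 <= k.

    Illustrative sketch only; not the SABAL algorithm itself.
    A : (m, n) design matrix, b : (m,) target, k : sparsity budget.
    """
    m, n = A.shape
    if step is None:
        # Conservative step size: 1 / (largest singular value of A)^2,
        # i.e., the Lipschitz constant of the least-squares gradient.
        step = 1.0 / np.linalg.norm(A, 2) ** 2
    w = np.zeros(n)
    for _ in range(n_iters):
        # Gradient step on the smooth least-squares objective.
        w = w - step * (A.T @ (A @ w - b))
        # Hard thresholding: keep only the k largest-magnitude entries.
        support = np.argpartition(np.abs(w), -k)[-k:]
        mask = np.zeros(n, dtype=bool)
        mask[support] = True
        w[~mask] = 0.0
    return w

# Toy usage: recover a 5-sparse weight vector from noisy measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 50))
w_true = np.zeros(50)
w_true[rng.choice(50, size=5, replace=False)] = rng.standard_normal(5)
b = A @ w_true + 0.01 * rng.standard_normal(200)
w_hat = iterative_hard_thresholding(A, b, k=5)
print("support recovered:", sorted(np.flatnonzero(w_hat)))
```

In a batch active learning reading, the nonzero entries of w would correspond to the selected (weighted) unlabeled points; the hard-thresholding step enforces the batch-size budget k at every iteration.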