Near-Polynomially Competitive Active Logistic Regression

Published: 22 Jan 2025 · Last Modified: 07 Mar 2025 · AISTATS 2025 Poster · CC BY 4.0
TL;DR: We give a polynomially competitive algorithm for active logistic regression.
Abstract: We address the problem of active logistic regression in the realizable setting. It is well known that active learning can require exponentially fewer label queries than passive learning, in some cases using $\log \frac{1}{\varepsilon}$ rather than $\mathrm{poly}(1/\varepsilon)$ samples to achieve error $\varepsilon$ above the optimum. We present the first algorithm that is polynomially competitive with the optimal algorithm on every input instance, up to factors polylogarithmic in the error and domain size. In particular, if any algorithm achieves sample complexity polylogarithmic in $\varepsilon$, so does ours. Our algorithm is based on efficient sampling and extends to learning a more general class of functions. We further support our theoretical results with experiments demonstrating performance gains for logistic regression over existing active learning algorithms.
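The abstract does not spell out the paper's algorithm, so the sketch below is not the authors' method. It is a minimal, illustrative implementation of pool-based active logistic regression via uncertainty sampling, a common baseline of the kind the phrase "existing active learning algorithms" suggests. The synthetic realizable instance, the query budget, and the seed set are all assumptions made for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic realizable instance: labels are drawn from a ground-truth
# logistic model (an illustrative assumption, not the paper's setup).
n, d = 2000, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
labels = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(int)

# Seed set with one example of each class so the first fit is well-posed.
queried = [int(np.flatnonzero(labels == 1)[0]),
           int(np.flatnonzero(labels == 0)[0])]
budget = 60  # total label queries (hypothetical budget)

while len(queried) < budget:
    model = LogisticRegression().fit(X[queried], labels[queried])
    probs = model.predict_proba(X)[:, 1]
    # Uncertainty sampling: query the unlabeled point whose predicted
    # probability is closest to 0.5, i.e. nearest the decision boundary.
    candidates = np.setdiff1d(np.arange(n), queried)
    queried.append(int(candidates[np.argmin(np.abs(probs[candidates] - 0.5))]))

final = LogisticRegression().fit(X[queried], labels[queried])
print(f"labels used: {len(queried)}, pool accuracy: {final.score(X, labels):.3f}")
```

Uncertainty sampling is known to be far from instance-optimal on some inputs; the paper's contribution is an algorithm whose query complexity is polynomially competitive with the best possible on every instance, which a greedy boundary-proximity rule like the one above does not guarantee.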
Submission Number: 784