Revisiting Agnostic Boosting

Published: 18 Sept 2025, Last Modified: 29 Oct 2025. NeurIPS 2025 poster. License: CC BY 4.0
Keywords: Weak-to-Strong Learning, Agnostic Learning, Sample Complexity, Margin-based Analysis, Boosting
TL;DR: We establish the sample complexity of agnostic boosting up to logarithmic factors by providing novel upper and lower bounds.
Abstract: Boosting is a key method in statistical learning, enabling the conversion of weak learners into strong ones. While well studied in the realizable case, the statistical properties of weak-to-strong learning remain less understood in the agnostic setting, where no assumptions are made on the distribution of the labels. In this work, we propose a new agnostic boosting algorithm with substantially improved sample complexity compared to prior works, under very general assumptions. Our approach is based on a reduction to the realizable case, followed by a margin-based filtering of high-quality hypotheses. Furthermore, we show a nearly matching lower bound, settling the sample complexity of agnostic boosting up to logarithmic factors.
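The abstract's second step, margin-based filtering, can be illustrated with a minimal sketch. This is not the paper's algorithm; the function names (`empirical_error`, `margin_filter`, `select_best`) and the specific threshold rule are our own illustrative assumptions. The idea shown: among candidate hypotheses (e.g., outputs of realizable boosting runs), retain only those whose empirical edge over random guessing on a held-out sample exceeds a margin parameter, then output the survivor with the smallest empirical error.

```python
# Hypothetical sketch of margin-based filtering; not the paper's actual algorithm.

def empirical_error(h, sample):
    """Fraction of validation points the hypothesis mislabels (labels in {-1, +1})."""
    return sum(h(x) != y for x, y in sample) / len(sample)

def margin_filter(hypotheses, sample, gamma):
    """Keep hypotheses whose edge over random guessing is at least gamma."""
    return [h for h in hypotheses if 0.5 - empirical_error(h, sample) >= gamma]

def select_best(hypotheses, sample, gamma):
    """Return the surviving hypothesis with smallest empirical error.

    Falls back to the full candidate set if the filter eliminates everything.
    """
    survivors = margin_filter(hypotheses, sample, gamma) or hypotheses
    return min(survivors, key=lambda h: empirical_error(h, sample))
```

For instance, on a sign-labeling task, a perfect sign classifier survives a filter with `gamma = 0.2` while a constant classifier (error 1/2, edge 0) is discarded, so `select_best` returns the former.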
Primary Area: Theory (e.g., control theory, learning theory, algorithmic game theory)
Submission Number: 12679