Robust Online Learning

Published: 18 Dec 2025, Last Modified: 02 Mar 2026 · ALT 2026 · CC BY 4.0
Keywords: Robustness, Online learning, Littlestone dimension
Abstract: We study the problem of learning robust classifiers, where the classifier receives an adversarially perturbed input and the clean data point and its label are also chosen adversarially. We formulate this as an online learning problem and consider both the realizable and the agnostic learnability of hypothesis classes. We define a new dimension of classes and show that it controls the mistake bounds in the realizable setting and the regret bounds in the agnostic setting. In contrast to the dimension that characterizes learnability in the PAC setting, our dimension is rather simple and resembles the Littlestone dimension. We generalize our dimension to multiclass hypothesis classes and prove similar results in the realizable case. Finally, we study the case where the learner does not know the set of allowed perturbations for each point and only has some prior over them.
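To make the protocol in the abstract concrete, the following is a minimal sketch (not the paper's algorithm) of the robust realizable online setting with a standard halving-style learner over a finite hypothesis class: at each round the learner sees only a perturbed input, predicts by majority vote over the surviving hypotheses, and eliminates hypotheses that err once the true label is revealed. All names and the toy threshold class are illustrative assumptions.

```python
def robust_halving(hypotheses, rounds):
    """Halving learner in the robust online protocol.

    hypotheses: list of callables z -> label (a finite class).
    rounds: iterable of (z, y), where z is the perturbed input shown to
        the learner and y is the true label of the hidden clean point.
    Returns the number of mistakes made.

    In the realizable case the target hypothesis labels every allowed
    perturbation of a clean point with that point's label, so it is never
    eliminated and the mistake count is at most log2(len(hypotheses)).
    """
    version_space = list(hypotheses)
    mistakes = 0
    for z, y in rounds:
        votes = [h(z) for h in version_space]
        # Predict by majority vote over the surviving hypotheses.
        prediction = max(set(votes), key=votes.count)
        if prediction != y:
            mistakes += 1
        # Eliminate hypotheses that err on the perturbed input.
        version_space = [h for h in version_space if h(z) == y]
    return mistakes


# Toy example: threshold classifiers on the line, target threshold t = 2.
# The adversary perturbs each clean point but, being realizable, keeps the
# perturbed input on the correct side of the target threshold.
hyps = [lambda z, t=t: int(z >= t) for t in range(5)]
rounds = [(2.6, 1), (1.4, 0), (3.1, 1)]
print(robust_halving(hyps, rounds))  # at most log2(5) ≈ 2.3 mistakes
```

The point of the sketch is only the protocol shape (perturbed input in, prediction out, true label revealed); the paper's contribution is characterizing which classes admit such bounded-mistake learners via a new, Littlestone-like dimension.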
Submission Number: 44