Smoothed Analysis of Sequential Probability Assignment

Published: 21 Sept 2023, Last Modified: 02 Nov 2023. NeurIPS 2023 spotlight.
Keywords: Online learning, Log loss, Information theory, Smoothed Analysis, Beyond worst case analysis, Oracle Efficient Online Learning
TL;DR: We study sequential probability assignment in the smoothed adversary model, presenting both statistically and computationally efficient algorithms that achieve rates depending on offline complexity measures.
Abstract: We initiate the study of smoothed analysis for the sequential probability assignment problem with contexts. We study information-theoretically optimal minimax rates as well as a framework for algorithmic reduction involving the maximum likelihood estimator (MLE) oracle. Our approach establishes a general-purpose reduction from minimax rates for sequential probability assignment against smoothed adversaries to minimax rates for transductive learning. This leads to optimal (logarithmic) fast rates for parametric classes and classes with finite VC dimension. On the algorithmic front, we develop an algorithm that efficiently taps into the MLE oracle for general classes of functions. We show that under general conditions this algorithmic approach yields sublinear regret.
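To make the setting concrete, the following is a minimal, hypothetical sketch (not the paper's algorithm) of sequential probability assignment under log loss with an MLE oracle: each round the learner predicts with the maximum likelihood fit to past outcomes, and regret is measured against the best fixed predictor in hindsight. The Bernoulli parameter grid, the i.i.d. outcome stream, and the `follow_the_mle` wrapper are all illustrative assumptions introduced here.

```python
import math
import random

def mle_oracle(history, param_grid):
    # Stand-in for an MLE oracle: return the parameter in param_grid
    # maximizing the likelihood of the observed binary history.
    # Here the function class is simply Bernoulli(p) for p in param_grid.
    if not history:
        return 0.5
    best_p, best_ll = None, -math.inf
    for p in param_grid:
        ll = sum(math.log(p) if y == 1 else math.log(1 - p) for y in history)
        if ll > best_ll:
            best_p, best_ll = p, ll
    return best_p

def log_loss(p, y):
    # Log loss of predicting probability p for binary outcome y.
    return -math.log(p if y == 1 else 1 - p)

def follow_the_mle(outcomes, param_grid):
    # Predict each round with the MLE fit to past outcomes (one oracle
    # call per round); return cumulative regret against the best fixed
    # parameter in hindsight.
    history, total = [], 0.0
    for y in outcomes:
        p = mle_oracle(history, param_grid)
        total += log_loss(p, y)
        history.append(y)
    best_hindsight = min(
        sum(log_loss(p, y) for y in outcomes) for p in param_grid
    )
    return total - best_hindsight

random.seed(0)
# Smoothed-style data: outcomes drawn from a fixed Bernoulli(0.7) source.
outcomes = [1 if random.random() < 0.7 else 0 for _ in range(200)]
grid = [i / 20 for i in range(1, 20)]  # parameters clipped away from {0, 1}
regret = follow_the_mle(outcomes, grid)
print(f"regret vs best-in-hindsight: {regret:.3f}")
```

On i.i.d. (maximally smooth) data like this, the per-round MLE predictions stabilize quickly and the cumulative regret stays small; the paper's contribution concerns what such oracle-based strategies can guarantee under general smoothed adversaries rather than this toy i.i.d. case.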
Supplementary Material: pdf
Submission Number: 3828