Efficient Logistic Regression with Mixture of Sigmoids

Published: 03 Feb 2026, Last Modified: 03 Feb 2026 · AISTATS 2026 Poster · CC BY 4.0
TL;DR: We make the Exponential Weights Algorithm (EWA) for online logistic regression both regret-optimal and computationally practical via MALA, revealing a large-$B$ geometry that links it to SVMs and yields margin-based guarantees.
Abstract: This paper studies the Exponential Weights Algorithm (EWA) with an isotropic Gaussian prior for online logistic regression. Our analysis establishes a worst-case regret bound of order $O\left(d \log(Bn)\right)$ against the best linear predictor of norm at most $B$, together with a practical implementation whose total runtime is $\tilde O(B^3 n^5)$ -- a substantial improvement over the $O(B^{18} n^{37})$ complexity of prior work (Foster et al., 2018), which achieved the same rate as Kakade and Ng (2005). Beyond efficiency, our analysis reveals new geometric structure in the regime of linearly separable data. As the comparator norm bound $B \rightarrow \infty$, we show that EWA's predictions converge to a solid-angle vote over separating directions: a voting classifier weighted by a truncated Gaussian distribution on the cone of directions that separate the data. Moreover, on any fixed-margin slice, the mode of this distribution has the same direction as the hard-margin SVM solution (up to scale). Non-asymptotically, we prove that for sufficiently large $B$, the regret becomes independent of $B$ and scales logarithmically with the margin $\gamma$. This yields a rate of $O(d \log n)$ for well-specified logistic models with Gaussian design.
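The abstract's core computational idea — EWA with a Gaussian prior, implemented by sampling from the resulting logistic posterior with MALA and averaging sigmoids over the samples — can be illustrated with a minimal sketch. This is not the paper's implementation (the paper's step sizes, mixing-time analysis, and overall $\tilde O(B^3 n^5)$ accounting are not reproduced here); the prior variance `sigma2`, step size `step`, and chain lengths below are illustrative choices.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def log_post(w, X, y, sigma2):
    # Log of the unnormalized EWA posterior: isotropic Gaussian prior
    # times exp(-cumulative log loss) on labels y in {0, 1}.
    z = X @ w
    ll = np.sum(y * z - np.logaddexp(0.0, z))  # stable log-likelihood
    return ll - w @ w / (2 * sigma2)

def grad_log_post(w, X, y, sigma2):
    p = sigmoid(X @ w)
    return X.T @ (y - p) - w / sigma2

def mala_predict(X, y, x_new, sigma2=1.0, step=1e-2, n_steps=2000, burn=500, rng=None):
    """Estimate P(y=1 | x_new) as a mixture of sigmoids over MALA samples."""
    rng = np.random.default_rng(rng)
    w = np.zeros(X.shape[1])
    lp = log_post(w, X, y, sigma2)
    g = grad_log_post(w, X, y, sigma2)
    preds = []
    for t in range(n_steps):
        # Langevin proposal: drift along the gradient plus Gaussian noise.
        mean = w + step * g
        w_prop = mean + np.sqrt(2 * step) * rng.standard_normal(w.shape)
        lp_prop = log_post(w_prop, X, y, sigma2)
        g_prop = grad_log_post(w_prop, X, y, sigma2)
        mean_back = w_prop + step * g_prop
        # Metropolis-Hastings correction with the asymmetric Gaussian proposals.
        log_q_fwd = -np.sum((w_prop - mean) ** 2) / (4 * step)
        log_q_back = -np.sum((w - mean_back) ** 2) / (4 * step)
        if np.log(rng.uniform()) < lp_prop - lp + log_q_back - log_q_fwd:
            w, lp, g = w_prop, lp_prop, g_prop
        if t >= burn:
            preds.append(sigmoid(x_new @ w))
    return float(np.mean(preds))
```

Averaging `sigmoid(x_new @ w)` over posterior samples (rather than plugging in a point estimate) is what makes the predictor a mixture of sigmoids, matching the EWA prediction the abstract analyzes.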
Submission Number: 906