Efficient Logistic Regression with Mixture of Sigmoids

Published: 03 Feb 2026, Last Modified: 23 Apr 2026 · AISTATS 2026 Poster · CC BY 4.0
TL;DR: We make the Exponential Weights Algorithm (EWA) for online logistic regression both regret-optimal and computationally practical via MALA, revealing a large-$B$ geometry that links to SVMs and yields margin-based guarantees.
Abstract: This paper studies the Exponential Weights (EW) algorithm with an isotropic Gaussian prior for online logistic regression. We show that the near-optimal worst-case regret bound $O(d\log(Bn))$ for EW, established by \citet{kakade_ng_bayesianalg} against the best linear predictor of norm at most $B$, can be achieved with total worst-case computational complexity $\tilde O(B^3 n^5)$. This substantially improves on the $O(B^{18}n^{37})$ complexity of prior work achieving the same guarantee \citep{foster2018logistic}. Beyond efficiency, we analyze the large-$B$ regime under linear separability: after rescaling by $B$, the EW posterior converges as $B\to\infty$ to a standard Gaussian truncated to the version cone. Accordingly, the predictor converges to a solid-angle vote over separating directions and, on every fixed-margin slice of this cone, the mode of the corresponding truncated Gaussian is aligned with the hard-margin SVM direction. Using this geometry, we derive non-asymptotic regret bounds showing that once $B$ exceeds a margin-dependent threshold, the regret becomes independent of $B$ and grows only logarithmically with the inverse margin. Overall, our results show that EW can be both computationally tractable and geometrically adaptive in online classification.
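To make the setup concrete, below is a minimal sketch of the "mixture of sigmoids" predictor: the EW posterior for logistic regression with an isotropic Gaussian prior $N(0, \sigma^2 I)$ is sampled with MALA (Metropolis-adjusted Langevin), and the prediction on a new point averages the sigmoid over posterior samples. All names, the step size, and the sample counts are illustrative assumptions, not the paper's tuned algorithm or constants; labels are taken in $\{-1,+1\}$.

```python
import numpy as np

def log_posterior(w, X, y, sigma2):
    # Gaussian prior N(0, sigma2 * I) plus logistic log-likelihood, labels in {-1,+1}.
    margins = y * (X @ w)
    return -np.sum(w**2) / (2 * sigma2) - np.sum(np.logaddexp(0.0, -margins))

def grad_log_posterior(w, X, y, sigma2):
    # Gradient of the log posterior; d/dm log sigmoid(m) = sigmoid(-m).
    margins = y * (X @ w)
    s = 1.0 / (1.0 + np.exp(margins))  # sigmoid(-margins), numerically fine for moderate margins
    return -w / sigma2 + X.T @ (y * s)

def mala(X, y, sigma2=1.0, eps=0.05, n_steps=2000, rng=None):
    # Metropolis-adjusted Langevin: Langevin proposal + MH correction,
    # so the chain targets the EW posterior exactly (illustrative step size eps).
    rng = np.random.default_rng(0) if rng is None else rng
    d = X.shape[1]
    w = np.zeros(d)
    lp = log_posterior(w, X, y, sigma2)

    def q_logdens(a, b):
        # Log density (up to a shared constant) of proposing a from b.
        mean = b + eps * grad_log_posterior(b, X, y, sigma2)
        return -np.sum((a - mean) ** 2) / (4 * eps)

    samples, accepts = [], 0
    for _ in range(n_steps):
        prop = (w + eps * grad_log_posterior(w, X, y, sigma2)
                + np.sqrt(2 * eps) * rng.standard_normal(d))
        lp_prop = log_posterior(prop, X, y, sigma2)
        log_alpha = lp_prop - lp + q_logdens(w, prop) - q_logdens(prop, w)
        if np.log(rng.uniform()) < log_alpha:
            w, lp = prop, lp_prop
            accepts += 1
        samples.append(w.copy())
    return np.array(samples), accepts / n_steps

def ew_predict(samples, x):
    # EW prediction = posterior mean of sigmoid(w . x): a mixture of sigmoids.
    return float(np.mean(1.0 / (1.0 + np.exp(-(samples @ x)))))
```

On linearly separable data the sampled directions concentrate in the version cone, and the averaged sigmoid behaves like the solid-angle vote over separating directions described in the abstract.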
Submission Number: 906