Isotropic Noise in Stochastic and Quantum Convex Optimization

Published: 18 Sept 2025, Last Modified: 29 Oct 2025
Venue: NeurIPS 2025 poster
License: CC BY 4.0
Keywords: stochastic convex optimization, quantum convex optimization, cutting plane methods
TL;DR: We introduce a new gradient noise model for stochastic convex optimization and apply it to achieve state-of-the-art rates in both quantum and non-quantum settings.
Abstract: We consider the problem of minimizing a $d$-dimensional Lipschitz convex function using a stochastic gradient oracle. We introduce and motivate a setting where the noise of the stochastic gradient is isotropic in that it is bounded in every direction with high probability. We then develop an algorithm for this setting which improves upon prior results by a factor of $d$ in certain regimes, and as a corollary, achieves a new state-of-the-art complexity for sub-exponential noise. We give matching lower bounds (up to polylogarithmic factors) for both results. Additionally, we develop an efficient quantum isotropifier, a quantum algorithm which converts a variance-bounded quantum sampling oracle into one that outputs an unbiased estimate with isotropic error. Combining our results, we obtain improved dimension-dependent rates for quantum stochastic convex optimization.
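To make the isotropic noise model concrete, here is a minimal illustrative sketch (not the paper's algorithm): a stochastic gradient oracle whose additive noise has covariance $(\sigma^2/d)\,I$, so that its projection onto any fixed unit direction is small with high probability even though the noise vector's full norm is of order $\sigma$. The oracle function and parameter names are our own, for illustration only.

```python
import numpy as np

def isotropic_gradient_oracle(grad, sigma, rng):
    """Illustrative oracle: returns an unbiased gradient estimate whose
    noise projection onto any unit vector is sub-Gaussian with
    parameter sigma / sqrt(d) -- i.e., bounded in every direction w.h.p.
    (This is an assumed toy instance, not the paper's construction.)"""
    d = grad.shape[0]
    noise = rng.normal(0.0, sigma / np.sqrt(d), size=d)
    return grad + noise

rng = np.random.default_rng(0)
d, sigma = 1000, 1.0
g = np.zeros(d)  # true gradient (zero, so the output is pure noise)
est = isotropic_gradient_oracle(g, sigma, rng)

# The full noise norm concentrates near sigma, but its projection onto
# an arbitrary unit direction u is only about sigma / sqrt(d).
u = rng.normal(size=d)
u /= np.linalg.norm(u)
proj = abs(est @ u)
print(np.linalg.norm(est - g), proj)
```

The contrast between the two printed quantities is the point of the model: a variance bound controls only the first (the overall noise magnitude), while isotropy additionally controls the second (the noise seen along every direction), which is the stronger per-direction guarantee the abstract refers to.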
Primary Area: Optimization (e.g., convex and non-convex, stochastic, robust)
Submission Number: 19585