Convex optimization with $p$-norm oracles

Published: 18 Dec 2025, Last Modified: 21 Feb 2026
Venue: ALT 2026
License: CC BY 4.0
Keywords: Convex Optimization, Proximal Point Methods, Acceleration
TL;DR: We give a general proximal point algorithm that yields improved rates for convex optimization problems, and we show that these rates are optimal.
Abstract: In recent years, there have been significant advances in efficiently solving $\ell_s$-regression using linear system solvers and $\ell_2$-regression [Adil-Kyng-Peng-Sachdeva, J. ACM'24]. Would efficient smoothed $\ell_p$-norm solvers lead to even faster rates for solving $\ell_s$-regression when $2 \leq p < s$? In this paper, we give an affirmative answer to this question and show how to solve $\ell_s$-regression using $\tilde{O}(n^{\frac{\nu}{1+\nu}})$ iterations of solving smoothed $\ell_p$-regression problems, where $\nu := \frac{1}{p} - \frac{1}{s}$. To obtain this result, we provide improved accelerated rates for convex optimization problems when given access to an _$\ell_p^s(\lambda)$-proximal oracle_, which, for a point $c$, returns the solution of the regularized problem $\min_{x} f(x) + \lambda \|x-c\|_p^s$. Additionally, we show that these rates for the $\ell_p^s(\lambda)$-proximal oracle are optimal for algorithms that query in the span of the outputs of the oracle, and we further apply our techniques to settings of high-order and quasi-self-concordant optimization.
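To make the oracle interface concrete, below is a minimal Python sketch of an (unaccelerated) proximal point loop built on an $\ell_p^s(\lambda)$-proximal oracle. This is not the paper's algorithm: the function names (`lp_s_prox_oracle`, `proximal_point`) are hypothetical, the oracle is stood in by a generic numerical solver rather than a fast smoothed $\ell_p$-regression solver, and the acceleration that achieves the $\tilde{O}(n^{\nu/(1+\nu)})$ iteration bound is omitted.

```python
import numpy as np
from scipy.optimize import minimize

def lp_s_prox_oracle(f, c, lam, p, s):
    """Illustrative ell_p^s(lambda)-proximal oracle: given a center c,
    return an (approximate) argmin_x of f(x) + lam * ||x - c||_p^s.
    A generic quasi-Newton solver stands in for the fast smoothed
    ell_p-regression solvers the paper assumes."""
    obj = lambda x: f(x) + lam * np.linalg.norm(x - c, ord=p) ** s
    return minimize(obj, x0=c, method="BFGS").x

def proximal_point(f, x0, lam, p, s, num_iters):
    """Plain proximal point outer loop: repeatedly re-center the oracle
    at the current iterate. The paper instead accelerates this loop,
    reducing the number of oracle calls."""
    x = x0
    for _ in range(num_iters):
        x = lp_s_prox_oracle(f, x, lam, p, s)
    return x

# Toy usage: an ell_4-regression objective solved via an ell_2^4 oracle
# (p = 2, s = 4, so nu = 1/2 - 1/4 = 1/4). Data here is synthetic.
rng = np.random.default_rng(0)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)
f = lambda x: np.linalg.norm(A @ x - b, ord=4) ** 4
x_hat = proximal_point(f, x0=np.zeros(5), lam=1.0, p=2, s=4, num_iters=25)
```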
Submission Number: 104