Keywords: convex optimization, lower bound complexity, proximal incremental first-order oracle
Abstract: This paper studies the lower bound complexity of the optimization problem whose objective function is the average of $n$ individual smooth convex functions. We consider algorithms that have access to the gradient and proximal oracles of each individual component.
For the strongly convex case, we prove that such an algorithm cannot reach an $\eps$-suboptimal point in fewer than $\Omega((n+\sqrt{\kappa n})\log(1/\eps))$ iterations, where $\kappa$ is the condition number of the objective function. This lower bound is tighter than previous results and matches the upper bound of Point-SAGA, an existing proximal incremental first-order oracle (PIFO) algorithm.
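For concreteness, this is the same form as the known upper bound of Point-SAGA (Defazio, 2016); writing $\kappa = L/\mu$ for an $L$-smooth, $\mu$-strongly convex objective (the standard setting, not spelled out in this abstract), the two bounds read side by side as
$$\underbrace{\Omega\big((n+\sqrt{\kappa n})\log(1/\eps)\big)}_{\text{lower bound (this paper)}} \quad \text{vs.} \quad \underbrace{O\big((n+\sqrt{\kappa n})\log(1/\eps)\big)}_{\text{upper bound (Point-SAGA)}},$$
so the bound is tight up to constant factors.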
We develop a novel construction to prove this result, which partitions the tridiagonal matrix of the classical examples into $n$ groups, making the problem difficult enough for stochastic algorithms.
This construction is amenable to the analysis of the proximal oracle and also extends naturally to the general convex and average smooth cases.
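The abstract does not detail the construction, so the following is only a rough sketch of the partitioning idea, not the paper's actual hard instance: it decomposes the classical tridiagonal matrix (2 on the diagonal, $-1$ on the off-diagonals) into rank-one difference terms and assigns them to $n$ groups; the round-robin assignment, the scaling, and all function names are illustrative assumptions.

```python
import numpy as np

def difference_terms(d):
    # Rank-one building blocks of the classical tridiagonal matrix:
    # A = sum_r (e_r - e_{r+1})(e_r - e_{r+1})^T + e_1 e_1^T + e_d e_d^T,
    # which has 2 on the diagonal and -1 on the two off-diagonals.
    terms = []
    for r in range(d - 1):
        v = np.zeros(d)
        v[r], v[r + 1] = 1.0, -1.0
        terms.append(np.outer(v, v))
    for r in (0, d - 1):  # boundary terms e_1 e_1^T and e_d e_d^T
        e = np.zeros(d)
        e[r] = 1.0
        terms.append(np.outer(e, e))
    return terms

def partition_into_groups(d, n):
    # Assign the building blocks to n groups round-robin (an assumed
    # scheme), so that A = A_1 + ... + A_n and every A_i is positive
    # semidefinite: each component f_i(x) = (n/2) x^T A_i x stays convex.
    groups = [np.zeros((d, d)) for _ in range(n)]
    for idx, T in enumerate(difference_terms(d)):
        groups[idx % n] += T
    return groups

if __name__ == "__main__":
    d, n = 12, 3
    parts = partition_into_groups(d, n)
    A = sum(parts)
    assert np.allclose(np.diag(A), 2.0)                 # diagonal of 2's
    assert np.allclose(A, np.triu(np.tril(A, 1), -1))   # tridiagonal shape
```

The design intuition, roughly: each coupling term $(e_r - e_{r+1})(e_r - e_{r+1})^\top$ lives in exactly one component, so information about the solution can propagate along the chain of coordinates only through queries to the component that owns the corresponding term, which is what makes the instance hard for stochastic algorithms.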