Analyzing and Improving Greedy 2-Coordinate Updates For Equality-Constrained Optimization via Steepest Descent in the 1-Norm
Keywords: Coordinate descent, SVM, LIBSVM, Steepest descent, convex optimization
Abstract: We first consider minimizing a smooth function subject to a summation constraint over its variables.
By exploiting a connection between the greedy 2-coordinate update for this problem and equality-constrained steepest descent in the 1-norm, we give a convergence rate for greedy selection that is faster than random selection and independent of the problem dimension $n$ (under a proximal Polyak-Łojasiewicz assumption). We then consider minimizing with both a summation constraint and bound constraints, as they arise in the support vector machine dual problem. Existing greedy rules for this setting either only guarantee trivial progress or require $O(n^2)$ time to compute. We show that bound- and summation-constrained steepest descent in the 1-norm guarantees more progress per iteration than previous rules and can be computed in only $O(n \log n)$ time.
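As a rough illustration of the setting (not the paper's actual algorithm, whose selection rule and step size are specified in the full text), a greedy 2-coordinate update for minimizing an $L$-smooth $f$ subject to a fixed coordinate sum can be sketched as follows: pick the coordinates with the largest and smallest partial derivatives and move along the direction $e_j - e_i$, which leaves the sum unchanged. The step size $(g_i - g_j)/(2L)$ minimizes the standard quadratic upper bound restricted to this pair; the function names here are hypothetical.

```python
import numpy as np

def greedy_two_coordinate_step(x, grad, L):
    """One greedy 2-coordinate update that preserves sum(x).

    Decreases the coordinate with the largest partial derivative and
    increases the one with the smallest, so the move is along e_j - e_i
    and the summation constraint is maintained exactly. Assumes f is
    L-smooth; the step size minimizes the quadratic upper bound on f
    restricted to this coordinate pair.
    """
    g = grad(x)
    i = int(np.argmax(g))  # coordinate to decrease
    j = int(np.argmin(g))  # coordinate to increase
    step = (g[i] - g[j]) / (2.0 * L)
    x = x.copy()
    x[i] -= step
    x[j] += step
    return x

# Toy usage: minimize 0.5*||x - c||^2 subject to sum(x) = 0.
# The constrained minimizer is c - mean(c); the gradient is x - c and L = 1.
c = np.array([3.0, -1.0, 2.0])
x = np.zeros(3)
for _ in range(200):
    x = greedy_two_coordinate_step(x, lambda z: z - c, 1.0)
```

Each update equalizes the two selected partial derivatives, so the iterates drive all partial derivatives toward a common value, which is exactly the KKT condition for the summation constraint.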
Supplementary Material: pdf
Submission Number: 11920