Abstract: We consider minimizing a smooth function subject to an equality constraint. We analyze a greedy 2-coordinate update algorithm, and prove that greedy coordinate selection leads to faster convergence than random selection (under the Polyak-\L{}ojasiewicz assumption). Our simple analysis exploits an equivalence between the greedy 2-coordinate update and equality-constrained steepest descent in the L1-norm. Unlike previous 2-coordinate analyses, our convergence rate is dimension independent.
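To make the setting concrete, here is a hypothetical sketch of a greedy 2-coordinate update for a sum constraint (the abstract does not give implementation details, so the constraint form, selection rule, and step size below are illustrative assumptions): pick the coordinates with the largest and smallest partial derivatives and transfer mass between them, which leaves the linear equality constraint invariant.

```python
import numpy as np

def greedy_two_coord(f_grad, x0, step, iters):
    """Illustrative greedy 2-coordinate descent for
    min f(x) s.t. sum(x) = sum(x0) (assumed constraint form)."""
    x = x0.copy()
    for _ in range(iters):
        g = f_grad(x)
        i, j = np.argmax(g), np.argmin(g)  # greedy pair: extreme partials
        delta = step * (g[i] - g[j]) / 2.0
        x[i] -= delta  # shrink coordinate with largest gradient
        x[j] += delta  # grow coordinate with smallest gradient
        # x[i] + x[j] is unchanged, so sum(x) is preserved
    return x

# toy example: f(x) = 0.5 * ||x - b||^2 is 1-smooth and satisfies PL
b = np.array([3.0, -1.0, 2.0, 0.0])
grad = lambda x: x - b
x = greedy_two_coord(grad, np.zeros(4), step=1.0, iters=50)
```

On this quadratic the iterates stay on the hyperplane sum(x) = 0 and converge to the projection of b onto it, b - mean(b).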