A Near-Optimal Single-Loop Stochastic Algorithm for Convex Finite-Sum Coupled Compositional Optimization
Abstract: This paper studies a class of convex Finite-sum Coupled Compositional Optimization (cFCCO) problems with applications including group distributionally robust optimization (GDRO) and learning with imbalanced data. To better address these problems, we introduce an efficient single-loop primal-dual block-coordinate stochastic algorithm called ALEXR. The algorithm employs block-coordinate stochastic mirror ascent with extrapolation for the dual variable and stochastic proximal gradient descent updates for the primal variable. We establish the convergence rates of ALEXR in both convex and strongly convex cases, under both smoothness and non-smoothness conditions on the involved functions. These rates not only improve upon the best-known rates from previous work on smooth cFCCO problems but also expand the scope of cFCCO to more challenging non-smooth problems, such as the dual form of GDRO. Finally, we derive lower complexity bounds, demonstrating the (near-)optimality of ALEXR within a broad class of stochastic algorithms for cFCCO. Experimental results on GDRO and partial Area Under the ROC Curve (pAUC) maximization demonstrate the promising performance of our algorithm.
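To make the update structure described in the abstract concrete, below is a minimal sketch of one plausible single-loop primal-dual block-coordinate iteration in that spirit: block-coordinate dual ascent with extrapolation, followed by a stochastic proximal gradient step on the primal variable. The toy problem instance, the choice f_i(g) = g^2 (with conjugate f*(y) = y^2/4, so the mirror/Euclidean ascent step has a closed form), the step sizes eta and tau, and the extrapolation weight theta are all illustrative assumptions for this sketch, not the paper's exact ALEXR algorithm or its tuned parameters.

```python
# Hedged sketch (assumptions noted above): solve
#   min_w (1/n) sum_i f(g_i(w)) + lam * ||w||_1,  f(g) = g^2,
#   g_i(w) = E_xi[x_{i,xi}^T w] - b_i,
# via the saddle-point form  min_w max_y (1/n) sum_i [ y_i g_i(w) - f*(y_i) ].
import numpy as np

rng = np.random.default_rng(0)
n, d, m = 20, 10, 50                 # groups, dimension, samples per group
X = rng.normal(size=(n, m, d))       # per-group data defining g_i
b = rng.normal(size=n)
lam, eta, tau, theta = 1e-3, 0.05, 0.5, 1.0
B = 5                                # block size for coordinate sampling

w = np.zeros(d)
y = np.zeros(n)                      # one dual variable per inner function g_i
g_prev = np.zeros(n)                 # stored stochastic estimates, for extrapolation

def soft_threshold(v, t):            # prox of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

for it in range(2000):
    blk = rng.choice(n, size=B, replace=False)   # sampled coordinate block
    xi = rng.integers(m, size=B)                 # one inner sample per group
    g_hat = np.array([X[i, j] @ w - b[i] for i, j in zip(blk, xi)])
    # Dual step: Euclidean mirror ascent on the sampled block, using an
    # extrapolated inner-function estimate; closed form since f*(y) = y^2/4.
    g_tilde = g_hat + theta * (g_hat - g_prev[blk])
    y[blk] = (g_tilde + y[blk] / tau) / (0.5 + 1.0 / tau)
    g_prev[blk] = g_hat
    # Primal step: stochastic proximal gradient with fresh inner samples.
    zeta = rng.integers(m, size=B)
    v = np.mean([y[i] * X[i, j] for i, j in zip(blk, zeta)], axis=0)
    w = soft_threshold(w - eta * v, eta * lam)

# Full-batch objective as a sanity check.
g_full = X.mean(axis=1) @ w - b
print(float(np.mean(g_full ** 2) + lam * np.abs(w).sum()))
```

Note the single-loop structure: each iteration touches only a sampled block of dual coordinates and draws a constant number of stochastic samples, with no inner loop over the full sum.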
Lay Summary: Many real-world machine-learning tasks can be formulated as ``finite-sum coupled compositional optimization'' problems, such as maximizing model performance on a highly imbalanced dataset. What we contributed: a new algorithm (ALEXR) with stronger convergence guarantees than previous algorithms under the same or weaker assumptions; a proof of the (near-)optimality of the proposed algorithm; and experimental validation of its effectiveness on real-world datasets.
Primary Area: Optimization->Stochastic
Keywords: Finite-Sum Coupled Compositional Optimization, Primal-Dual Algorithm, Block-Coordinate Algorithm
Submission Number: 8419