Distributionally Robust Optimization via Ball Oracle Acceleration

Published: 31 Oct 2022, Last Modified: 11 Jan 2023. NeurIPS 2022 Accept.
Keywords: convex optimization, distributionally robust optimization, theory, oracle complexity, Monteiro-Svaiter acceleration, accelerated methods, algorithm design, entropy regularization, multilevel Monte Carlo
Abstract: We develop and analyze algorithms for distributionally robust optimization (DRO) of convex losses. In particular, we consider group-structured and bounded $f$-divergence uncertainty sets. Our approach relies on an accelerated method that queries a ball optimization oracle, i.e., a subroutine that minimizes the objective within a small ball around the query point. Our main contribution is efficient implementations of this oracle for DRO objectives. For DRO with $N$ non-smooth loss functions, the resulting algorithms find an $\epsilon$-accurate solution with $\widetilde{O}\left(N\epsilon^{-2/3} + \epsilon^{-2}\right)$ first-order oracle queries to individual loss functions. Compared to existing algorithms for this problem, we improve complexity by a factor of up to $\epsilon^{-4/3}$.
TL;DR: We develop and theoretically analyze algorithms for distributionally robust optimization with group-structured and bounded $f$-divergence uncertainty sets.
Supplementary Material: pdf
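To make the notion of a ball optimization oracle concrete, here is a minimal illustrative sketch (not the paper's implementation): it approximately minimizes a group-structured DRO objective, the worst-group average loss, within a small Euclidean ball around the query point using projected subgradient descent. The absolute-error losses, function names, and all parameter choices below are hypothetical, chosen only for illustration.

```python
import numpy as np

def group_dro_objective(x, A_groups, b_groups):
    """Worst-group average absolute-error loss: max_g mean_i |a_i^T x - b_i|.
    (Hypothetical loss; the paper covers general convex losses.)"""
    return max(np.mean(np.abs(A @ x - b)) for A, b in zip(A_groups, b_groups))

def ball_oracle(x0, r, A_groups, b_groups, steps=500):
    """Approximately minimize the group-DRO objective over the ball
    ||x - x0|| <= r via projected subgradient descent.
    This is a naive stand-in for the oracle the paper implements efficiently."""
    x = x0.copy()
    best_x, best_val = x.copy(), group_dro_objective(x, A_groups, b_groups)
    for t in range(1, steps + 1):
        # Subgradient of the objective at x: gradient of the worst group's loss.
        vals = [np.mean(np.abs(A @ x - b)) for A, b in zip(A_groups, b_groups)]
        A, b = A_groups[int(np.argmax(vals))], b_groups[int(np.argmax(vals))]
        grad = A.T @ np.sign(A @ x - b) / len(b)
        x = x - (r / np.sqrt(t)) * grad       # diminishing step size
        d = x - x0                            # project back onto the ball
        if np.linalg.norm(d) > r:
            x = x0 + r * d / np.linalg.norm(d)
        val = group_dro_objective(x, A_groups, b_groups)
        if val < best_val:
            best_x, best_val = x.copy(), val
    return best_x, best_val
```

The returned point stays within radius `r` of the query `x0` and never exceeds the starting objective value; the accelerated outer method in the paper would repeatedly query such an oracle at successive points.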