Keywords: Fairness, Distributionally Robust Optimization, Linear Regression, Multi-task Learning
TL;DR: We present a new algorithm for minimizing the maximum linear regression loss over multiple groups.
Abstract: We present an algorithm for the empirical group distributionally robust (GDR) least squares problem. Given $m$ groups, a parameter vector in $\mathbb{R}^d$, and stacked design matrices and responses $\mathbf{A}$ and $\mathbf{b}$, our algorithm obtains a $(1+\varepsilon)$-multiplicatively optimal solution using $\widetilde{O}(\min(\mathsf{rank}(\mathbf{A}),m)^{1/3}\varepsilon^{-2/3})$ linear system solves of matrices of the form $\mathbf{A}^{\top}\mathbf{B}\mathbf{A}$ for block-diagonal $\mathbf{B}$. Our techniques build on a recent method that relates the empirical GDR problem to a carefully chosen least squares problem, combined with an application of ball-oracle acceleration. In moderate-accuracy regimes, our algorithm improves over all known interior point methods and matches the state-of-the-art guarantees for the special case of $\ell_{\infty}$ regression.
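For concreteness, the empirical GDR least squares objective described above can be written as follows; the per-group notation $(\mathbf{A}_i, \mathbf{b}_i)$ is an assumption based on the standard formulation, with $\mathbf{A}$ and $\mathbf{b}$ the vertical stackings of these blocks:

$$\min_{x \in \mathbb{R}^d} \; \max_{i \in [m]} \; \|\mathbf{A}_i x - \mathbf{b}_i\|_2^2,$$

where $\mathbf{A}_i \in \mathbb{R}^{n_i \times d}$ and $\mathbf{b}_i \in \mathbb{R}^{n_i}$ are the design matrix and response vector of group $i$. That is, rather than minimizing the total squared loss, the goal is a parameter vector whose worst group-wise loss is within a $(1+\varepsilon)$ factor of the best achievable.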
Submission Number: 46