Focus on the Common Good: Group Distributional Robustness Follows

Published: 28 Jan 2022 · Last Modified: 17 Sept 2023 · ICLR 2022 Poster · Readers: Everyone
Keywords: sub-population shift, robust optimization, domain generalization
Abstract: We consider the problem of training a classification model with group-annotated training data. Recent work has established that, under distribution shift across groups, models trained with the standard empirical risk minimization (ERM) objective perform poorly on minority groups, and that the group distributionally robust optimization (Group-DRO) objective is a better alternative. The starting point of this paper is the observation that, although Group-DRO outperforms ERM on minority groups for some benchmark datasets, there are several other datasets where it performs much worse than ERM. Inspired by ideas from the closely related problem of domain generalization, this paper proposes a new and simple algorithm that explicitly encourages the learning of features shared across groups. The key insight behind the proposed algorithm is that while Group-DRO focuses on the groups with the worst regularized loss, focusing instead on the groups that enable better performance even on other groups can lead to the learning of shared/common features, thereby improving minority-group performance beyond what Group-DRO achieves. Empirically, we show that the proposed algorithm matches or exceeds the performance of strong contemporary baselines, including ERM and Group-DRO, on standard benchmarks, both on minority groups and across all groups. Theoretically, we show that the proposed algorithm is a descent method and finds first-order stationary points of smooth nonconvex functions.
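The contrast between the two group-selection rules described in the abstract can be sketched on a toy problem. This is an illustrative sketch only, not the authors' implementation: the quadratic per-group losses, the centers, and the one-step look-ahead used to score "benefit to other groups" are all assumptions made for the example.

```python
import numpy as np

# Toy setup: each group g has a quadratic loss ||w - c_g||^2 with its own
# optimum c_g. Two centers cluster together (majority-like groups); one
# sits elsewhere (minority-like group). These centers are illustrative.
centers = np.array([[1.0, 0.0], [1.2, 0.1], [-0.5, 2.0]])

def group_loss(w, g):
    return float(np.sum((w - centers[g]) ** 2))

def group_grad(w, g):
    return 2.0 * (w - centers[g])

def dro_step(w, lr=0.1):
    # Group-DRO-style update: descend on the group with the worst current loss.
    g = max(range(len(centers)), key=lambda g: group_loss(w, g))
    return w - lr * group_grad(w, g)

def common_good_step(w, lr=0.1):
    # "Common good"-style update (assumed one-step look-ahead): descend on the
    # group whose gradient step most reduces the average loss over ALL groups,
    # i.e. the group that helps the others, not just itself.
    def avg_loss_after(g):
        w_try = w - lr * group_grad(w, g)
        return sum(group_loss(w_try, h) for h in range(len(centers))) / len(centers)
    g = min(range(len(centers)), key=avg_loss_after)
    return w - lr * group_grad(w, g)

w_dro = np.zeros(2)
w_cg = np.zeros(2)
for _ in range(50):
    w_dro = dro_step(w_dro)
    w_cg = common_good_step(w_cg)
```

Both rules drive down the initial worst-group loss, but they differ in what they optimize for: the DRO rule reacts to whichever group currently looks worst, while the common-good rule prefers update directions that benefit all groups at once, which is the behavior the paper argues promotes shared features.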
One-sentence Summary: We propose a new and simple algorithm for the sub-population shift problem that enables the learning of shared features and performs consistently well across several standard and real-world benchmarks.
Community Implementations: 3 code implementations (CatalyzeX)
12 Replies