Avoiding Spurious Correlations: Bridging Theory and Practice

09 Oct 2021, 14:49 (edited 02 Dec 2021) · NeurIPS 2021 Workshop DistShift Poster
  • Keywords: out-of-distribution shifts, OOD generalization, failure modes
  • TL;DR: We characterize existing approaches to improving OOD generalization by the spurious-correlation-induced skews they address, and empirically verify their effectiveness on several widely used OOD-shift datasets.
  • Abstract: Distribution shifts in the wild jeopardize the performance of machine learning models, as models tend to pick up spurious correlations during training. Recent work (Nagarajan et al., 2020) characterized two specific failure modes of out-of-distribution (OOD) generalization, and we extend this theoretical framework by interpreting existing algorithms as solutions to these failure modes. We then evaluate them on several image classification datasets and, in the process, surface two issues central to existing robustness techniques. For algorithms that require access to group information, we demonstrate that the group annotations included in standard OOD benchmarks fail to fully capture the spurious correlations present. For methods that do not rely on group annotations during training, the validation set used for model selection carries assumptions that are unrealistic in real-world settings. This leads us to explore how the choice of distribution shifts represented in validation data affects the effectiveness of different OOD robustness algorithms.