Causally motivated multi-shortcut identification and removal

Published: 21 Jul 2022, Last Modified: 05 May 2023. SCIS 2022 Poster.
Keywords: shortcut learning, spurious correlations, causality
TL;DR: We develop a method to identify and remove multiple shortcuts, yielding accurate models that are robust to distribution shifts.
Abstract: For predictive models to provide reliable guidance in decision-making processes, they are often required to be both accurate and robust to distribution shift. Shortcut learning, in which a model relies on spurious correlations (shortcuts) to predict the target label, undermines robustness and leads to models with poor out-of-distribution accuracy despite good in-distribution performance. Existing work on shortcut learning either assumes that the set of possible shortcuts is known a priori or that it is discoverable using interpretability methods such as saliency maps. Instead, we propose a two-step approach that (1) efficiently identifies relevant shortcuts and (2) leverages the identified shortcuts to build models that are robust to distribution shifts. Our approach relies on having access to a (possibly) high-dimensional set of auxiliary labels at training time, some of which correspond to possible shortcuts. We show both theoretically and empirically that our approach identifies a small sufficient set of shortcuts, leading to more efficient predictors in finite samples.
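
The abstract describes the two-step recipe only at a high level. As a rough illustration (not the authors' actual method), one generic instantiation could flag auxiliary labels that are strongly associated with the target as candidate shortcuts, then reweight the training data so the flagged shortcut is independent of the label before fitting the predictor. All function names, the correlation threshold, and the choice of balancing weights below are hypothetical.

# Illustrative sketch only -- not the paper's algorithm. Assumes binary target y
# and binary auxiliary labels; the threshold and reweighting scheme are arbitrary choices.
import numpy as np
from sklearn.linear_model import LogisticRegression

def identify_shortcuts(aux_labels, y, threshold=0.2):
    """Return indices of auxiliary labels whose absolute correlation with y exceeds a threshold."""
    hits = []
    for j in range(aux_labels.shape[1]):
        c = np.corrcoef(aux_labels[:, j], y)[0, 1]
        if not np.isnan(c) and abs(c) > threshold:
            hits.append(j)
    return hits

def balancing_weights(shortcut, y):
    """Inverse-frequency weights that make one binary shortcut independent of y in the weighted data."""
    weights = np.ones_like(y, dtype=float)
    for s in (0, 1):
        for t in (0, 1):
            mask = (shortcut == s) & (y == t)
            if mask.any():
                # Upweight or downweight each (shortcut, label) cell to a uniform 1/4 share.
                weights[mask] = 0.25 / mask.mean()
    return weights

def fit_shortcut_robust_model(X, y, aux_labels):
    """Toy pipeline: flag shortcuts, reweight on the first one found, and fit a classifier."""
    shortcuts = identify_shortcuts(aux_labels, y)
    w = balancing_weights(aux_labels[:, shortcuts[0]], y) if shortcuts else None
    model = LogisticRegression(max_iter=1000)
    model.fit(X, y, sample_weight=w)
    return model, shortcuts

In this toy version, reweighting breaks the statistical dependence between the selected shortcut and the label in the training objective, which is one common way to discourage a model from exploiting that shortcut; the paper's actual identification and removal procedures may differ.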