Keywords: distribution shifts, Shapley attribution, causal graph
TL;DR: We propose a method to attribute model performance changes to distribution shifts in causal mechanisms.
Abstract: The performance of machine learning models can differ significantly between training and deployment in novel environments due to shifts in the underlying data distribution. Attributing performance changes to specific distribution shifts is critical for identifying sources of model failure and for designing stable models. In this work, we design a novel method for attributing the performance difference between environments to shifts in the underlying causal mechanisms. To this end, we construct a cooperative game in which the contribution of each mechanism is quantified as its Shapley value. We demonstrate the method's ability to identify sources of spurious correlation and to attribute performance drops to shifts in label and/or feature distributions on synthetic and real-world datasets.
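The core idea of the cooperative game can be illustrated with a minimal sketch (not the authors' implementation): each causal mechanism is a "player", the value of a coalition is the model's performance when only the mechanisms in that coalition follow the target distribution, and each mechanism's attribution is its Shapley value. The `performance` value function, the mechanism names `P(Y)` and `P(X|Y)`, and the toy accuracy numbers below are all hypothetical placeholders.

```python
# Sketch of Shapley attribution over causal mechanisms (assumptions noted above).
from itertools import combinations
from math import factorial

def shapley_attribution(mechanisms, performance):
    """Attribute performance(all shifted) - performance(none shifted)
    across mechanisms via their Shapley values."""
    n = len(mechanisms)
    values = {}
    for m in mechanisms:
        others = [x for x in mechanisms if x != m]
        total = 0.0
        for k in range(n):
            for subset in combinations(others, k):
                # Shapley weight for a coalition of size k not containing m.
                weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                # Marginal effect of additionally shifting mechanism m.
                total += weight * (performance(set(subset) | {m}) - performance(set(subset)))
        values[m] = total
    return values

# Toy value function: performance drops when the label mechanism shifts,
# and drops further when the feature mechanism also shifts (illustrative numbers only).
def toy_performance(shifted):
    acc = 0.90
    if "P(Y)" in shifted:
        acc -= 0.10
    if "P(X|Y)" in shifted:
        acc -= 0.05
    return acc

print(shapley_attribution(["P(Y)", "P(X|Y)"], toy_performance))
# Attributions sum to the total change of -0.15: -0.10 for P(Y), -0.05 for P(X|Y).
```

By construction, the Shapley values sum to the total performance difference between the source and target environments, which is what makes them a natural attribution scheme here.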
Confirmation: Yes
Community Implementations: [2 code implementations](https://www.catalyzex.com/paper/why-did-the-model-fail-attributing-model/code)
0 Replies