Agreement-on-the-line: Predicting the Performance of Neural Networks under Distribution Shift

Published: 31 Oct 2022, 18:00 · Last Modified: 22 Jan 2023, 07:30 · NeurIPS 2022 Accept · Readers: Everyone
Keywords: generalization, out-of-distribution generalization, robustness
TL;DR: Estimating out-of-distribution (OOD) performance is hard because labeled data is expensive. In our work, we show that unlabeled data can be leveraged to predict OOD performance using models’ agreement.
Abstract: Recently, Miller et al. showed that a model's in-distribution (ID) accuracy has a strong linear correlation with its out-of-distribution (OOD) accuracy on several OOD benchmarks, a phenomenon they dubbed ``accuracy-on-the-line''. While this is a useful tool for model selection (i.e., the model most likely to perform best OOD is the one with the highest ID accuracy), it does not help estimate the actual OOD performance of models without access to a labeled OOD validation set. In this paper, we show that a similarly surprising phenomenon also holds for the agreement between pairs of neural network classifiers: whenever accuracy-on-the-line holds, we observe that the OOD agreement between the predictions of any pair of neural networks (with potentially different architectures) also exhibits a strong linear correlation with their ID agreement. Furthermore, we observe that the slope and bias of the OOD vs. ID agreement line closely match those of the OOD vs. ID accuracy line. This phenomenon, which we call agreement-on-the-line, has important practical applications: without any labeled data, we can predict the OOD accuracy of classifiers, since OOD agreement can be estimated with just unlabeled data. Our prediction algorithm outperforms previous methods both on shifts where agreement-on-the-line holds and, surprisingly, on shifts where accuracy is not on the line. This phenomenon also provides new insights into neural networks: unlike accuracy-on-the-line, agreement-on-the-line appears to hold only for neural network classifiers.
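The prediction recipe the abstract describes can be sketched as follows: measure pairwise agreement on labeled ID data and on unlabeled OOD data, fit a line through those points, then reuse its slope and bias to map each model's ID accuracy to a predicted OOD accuracy. This is a minimal illustrative sketch on synthetic numbers, assuming a plain least-squares fit on raw agreement rates (the paper's actual estimators may differ, e.g. in how values are scaled); all data below is made up.

```python
import numpy as np

# Hypothetical synthetic data: pairwise agreement rates between models,
# measured on labeled ID data and on *unlabeled* OOD data. Here we
# generate them from an assumed true line for illustration.
rng = np.random.default_rng(0)
true_slope, true_bias = 0.8, 0.05

id_agreement = rng.uniform(0.7, 0.95, size=20)         # ID agreement per model pair
ood_agreement = true_slope * id_agreement + true_bias  # agreement-on-the-line

# Fit the agreement line (slope, bias) -- no OOD labels are needed,
# since agreement only requires comparing two models' predictions.
slope, bias = np.polyfit(id_agreement, ood_agreement, deg=1)

# When agreement-on-the-line holds, the accuracy line shares this
# slope and bias, so ID accuracy alone predicts OOD accuracy.
id_accuracy = np.array([0.91, 0.85, 0.78])
pred_ood_accuracy = slope * id_accuracy + bias
```

The key point of the sketch is that the line is estimated entirely from agreement statistics, which require no OOD labels; labeled data is only used for the ID accuracies that the line then maps to OOD predictions.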
Supplementary Material: pdf