Improving Group-based Robustness and Calibration via Ordered Risk and Confidence Regularization

28 May 2022, 15:03 (modified: 21 Jul 2022, 01:30) · SCIS 2022 Poster
Keywords: robustness, dataset bias, spurious correlation
TL;DR: To combat spurious correlations, we introduce a new mechanism, ORC, which regularizes the relative ordering of the risks and the confidences of the groups in the training dataset.
Abstract: A neural network trained via empirical risk minimization achieves high accuracy on average but low accuracy on certain groups, especially when a spurious correlation is present. To build a model that is unbiased by spurious correlations, we hypothesize that inference on samples without spurious correlations should take relative precedence over inference on spuriously biased samples. Based on this hypothesis, we propose a relative regularization that induces the training risk of each group to follow a specific order, sorted according to each group's degree of spurious correlation. In addition, we introduce an ordering regularization based on the predictive confidence of each group to improve model calibration, where other robust models still suffer from large calibration errors. Together these form our complete algorithm, Ordered Risk and Confidence regularization (ORC). Our experiments demonstrate that ORC improves both group robustness and calibration against various types of spurious correlation in both synthetic and real-world datasets.
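The core idea of the risk-ordering regularizer can be illustrated with a minimal sketch. This is not the authors' implementation: the function name `ordered_risk_penalty`, the pairwise hinge form, and the assumption that groups are ordered from least to most spuriously biased are all illustrative choices, not details taken from the paper.

```python
import numpy as np

def ordered_risk_penalty(group_risks, order):
    """Hypothetical sketch of an ordered-risk regularizer.

    group_risks[i] is the average training risk of group i, and `order`
    lists group indices from least to most spuriously biased (an assumed
    convention). The penalty is positive whenever a less-biased group's
    risk exceeds a more-biased group's risk, nudging training to give
    relative precedence to the less-biased groups.
    """
    r = np.asarray(group_risks, dtype=float)[np.asarray(order)]
    # Hinge on each adjacent pair in the desired ordering: violations
    # (earlier risk larger than later risk) contribute linearly.
    return float(np.sum(np.maximum(0.0, r[:-1] - r[1:])))
```

In practice such a term would be added, with a weight, to the usual training loss; an analogous penalty on per-group predictive confidences would play the role of the confidence-ordering regularizer.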