Necessary and Sufficient Hypothesis of Curvature: Understanding the Connection Between Out-of-Distribution Generalization and Calibration

Published: 10 Mar 2023, Last Modified: 28 Apr 2023. ICLR 2023 Workshop DG Poster
Keywords: Out-of-Distribution Generalization, Calibration, Curvature, Sharpness-Aware Minimization
TL;DR: This study attempts to understand the connection between Out-of-Distribution Generalization and Calibration from the perspective of curvature.
Abstract: In this study, we address two significant issues that hinder the application of deep learning in real-world settings: Out-of-Distribution Generalization and Calibration. While the two have been researched in separate contexts, we hypothesize that both can be understood through the lens of curvature. Our extensive experiments demonstrate that training with Sharpness-Aware Minimization, which achieves low curvature, results in well-calibrated models with high accuracy, even on Out-of-Distribution datasets. Finally, we provide a theoretical analysis showing that low-curvature models are well-calibrated.
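The abstract attributes the effect to Sharpness-Aware Minimization (SAM), which seeks flat (low-curvature) minima by descending along the gradient evaluated at an adversarially perturbed point. The sketch below illustrates one SAM update on a toy quadratic loss; the loss function, step size `lr`, and neighborhood radius `rho` are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np

def sam_step(w, grad_fn, lr=0.05, rho=0.05):
    """One Sharpness-Aware Minimization step (sketch, not the paper's code)."""
    g = grad_fn(w)
    # Ascent step toward the approximate worst-case point in an L2 ball
    # of radius rho around the current weights.
    eps = rho * g / (np.linalg.norm(g) + 1e-12)
    # Descend using the gradient at the perturbed point, which penalizes
    # sharp, high-curvature regions of the loss surface.
    g_sharp = grad_fn(w + eps)
    return w - lr * g_sharp

# Toy quadratic loss L(w) = 0.5 * w^T A w with anisotropic curvature.
A = np.diag([1.0, 10.0])
grad = lambda w: A @ w
w = np.array([1.0, 1.0])
for _ in range(100):
    w = sam_step(w, grad)
```

In practice SAM wraps a base optimizer (e.g. SGD) and computes two forward-backward passes per step; the toy loop above keeps only the two-gradient structure of the update.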
Submission Number: 35