Pseudo-Calibration: Improving Predictive Uncertainty Estimation in Domain Adaptation

Published: 28 Oct 2023, Last Modified: 02 Apr 2024
Venue: DistShift 2023 Poster
Keywords: Domain Adaptation; Predictive Uncertainty; Model Calibration
TL;DR: We introduce Pseudo-Calibration, a novel and versatile post-hoc framework for calibrating predictive uncertainty in unsupervised domain adaptation.
Abstract: Unsupervised domain adaptation (UDA) improves model accuracy in an unlabeled target domain using a labeled source domain. However, UDA models often lack calibrated predictive uncertainty on target data, posing risks in safety-critical applications. In this paper, we address this under-explored challenge with Pseudo-Calibration (PseudoCal), a novel post-hoc calibration framework. In contrast to prior approaches, we consider UDA calibration as a target-domain specific unsupervised problem rather than a \emph{covariate shift} problem across domains. With a synthesized labeled pseudo-target set that captures the structure of the real target, we turn the unsupervised calibration problem into a supervised one, readily solvable with \emph{temperature scaling}. Extensive empirical evaluation across 5 diverse UDA scenarios involving 10 UDA methods, along with unsupervised fine-tuning of foundation models such as CLIP, consistently demonstrates the superior performance of PseudoCal over alternative calibration methods. Code is available at \url{}.
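The abstract reduces the unsupervised calibration problem to supervised temperature scaling on a synthesized, labeled pseudo-target set. Temperature scaling itself fits a single scalar that rescales the model's logits to minimize negative log-likelihood on a labeled calibration set. A minimal sketch of that fitting step follows; the grid-search optimizer and the function names are illustrative assumptions, not the paper's implementation, and the pseudo-target synthesis step is omitted:

```python
import numpy as np

def nll(logits, labels, T):
    """Negative log-likelihood of temperature-scaled softmax predictions."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)  # stabilize the exponentials
    log_probs = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    return -log_probs[np.arange(len(labels)), labels].mean()

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    """Pick the temperature minimizing NLL on a (pseudo-)labeled set.

    In PseudoCal's setting, `logits` and `labels` would come from the
    synthesized pseudo-target set rather than held-out real labels.
    """
    return min(grid, key=lambda T: nll(logits, labels, T))
```

A fitted temperature greater than 1 softens overconfident predictions; at inference time the same scalar divides the target-domain logits before the softmax, leaving the predicted class unchanged.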
Submission Number: 30