Source-Free Target Domain Confidence Calibration

26 Sept 2024 (modified: 05 Feb 2025) · Submitted to ICLR 2025 · CC BY 4.0
Keywords: confidence calibration, domain adaptation, source-free
TL;DR: We present a confidence calibration method for a source-free domain adaptation setup
Abstract:

In this study, we consider the source-free domain adaptation setup and address the challenge of calibrating the confidence of a model adapted to the target domain using only unlabeled data. The primary obstacle to uncertainty calibration is the absence of labeled data, which prevents computing the accuracy of the adapted network on the target domain. We address this by leveraging pseudo-labels generated from the source model's predictions to estimate the true, unobserved accuracy. We demonstrate that, although the pseudo-labels are noisy, the network accuracy computed with these pseudo-labels closely tracks the accuracy obtained with the correct labels. We validate the effectiveness of our calibration approach on standard domain adaptation datasets and show that it achieves results comparable to, or even better than, previous calibration methods that rely on the availability of labeled source data.
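The abstract does not specify the calibration mechanism, but a common choice in this setting is temperature scaling fit against the pseudo-labels instead of ground-truth labels. The sketch below is a hypothetical illustration of that idea, not the paper's actual method: pseudo-labels come from the source model's argmax predictions on target data, and a single temperature is chosen by minimizing the negative log-likelihood against those pseudo-labels (all arrays here are synthetic).

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-scaled softmax, computed in a numerically stable way."""
    z = logits / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def fit_temperature(logits, pseudo_labels, grid=np.linspace(0.5, 5.0, 91)):
    """Grid-search the temperature minimizing NLL w.r.t. pseudo-labels.

    No ground-truth target labels are used -- only pseudo-labels derived
    from the source model's predictions.
    """
    n = len(pseudo_labels)
    best_T, best_nll = 1.0, np.inf
    for T in grid:
        p = softmax(logits, T)
        nll = -np.log(p[np.arange(n), pseudo_labels] + 1e-12).mean()
        if nll < best_nll:
            best_T, best_nll = T, nll
    return best_T

# Toy data: overconfident logits of the adapted target model, and noisy
# source-model logits that supply the pseudo-labels.
rng = np.random.default_rng(0)
target_logits = rng.normal(size=(1000, 10)) * 5.0
source_logits = target_logits + rng.normal(size=target_logits.shape)
pseudo_labels = source_logits.argmax(axis=1)  # hard pseudo-labels

T = fit_temperature(target_logits, pseudo_labels)
calibrated = softmax(target_logits, T)  # calibrated target-domain confidences
```

The key property the paper argues for is that the NLL (and hence the fitted temperature) computed against noisy pseudo-labels is close to what the true labels would give, making source-free calibration feasible.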

Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 6363
