Conformal Uncertainty Indicator for Continual Test-Time Adaptation

ICLR 2026 Conference Submission12614 Authors

18 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: test-time adaptation, uncertainty measurement, continual learning
TL;DR: This paper proposes a Conformal Uncertainty Indicator (CUI) for CTTA, leveraging Conformal Prediction (CP) to generate prediction sets that include the true label with a specified coverage probability.
Abstract: Continual Test-Time Adaptation (CTTA) enables models to adapt to sequential domain shifts during testing, but their reliance on pseudo-labels makes them prone to error accumulation, so reliable uncertainty estimation is critical. We study this problem under the calibration-aided CTTA setting, where a small calibration buffer from the source domain is available as a reference. We propose the Conformal Uncertainty Indicator (CUI), a plug-and-play method that leverages Conformal Prediction (CP) with calibration data. Unlike standard CP, which suffers from a coverage gap under domain shifts, CUI jointly measures model shift and data shift to adjust conformal quantiles and restore coverage. The resulting prediction set size provides a reliable indicator of test-time uncertainty. Building on this, we introduce a CUI-guided adaptation strategy that updates models only on confident samples. Experiments on three benchmarks show that CUI achieves accurate uncertainty estimation and improves the robustness of multiple CTTA baselines.
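To make the CP machinery referenced in the abstract concrete, here is a minimal sketch of standard split conformal prediction, the baseline whose quantile CUI adjusts under shift. It assumes softmax outputs and uses the simple "1 minus true-class probability" nonconformity score; the function name and score choice are illustrative, not the paper's exact formulation.

```python
import numpy as np

def conformal_prediction_sets(cal_probs, cal_labels, test_probs, alpha=0.1):
    """Split conformal prediction: build sets that cover the true label
    with probability >= 1 - alpha, assuming calibration and test data
    are exchangeable (the assumption domain shift breaks)."""
    n = len(cal_labels)
    # Nonconformity score: 1 - softmax probability of the true class.
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]
    # Finite-sample-corrected (1 - alpha) quantile of calibration scores.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(scores, q_level, method="higher")
    # Include every class whose score does not exceed the quantile;
    # larger sets signal higher test-time uncertainty.
    return [np.where(1.0 - p <= q_hat)[0] for p in test_probs]
```

Under domain shift the calibration quantile `q_hat` no longer matches the test score distribution, producing the coverage gap the abstract describes; CUI's contribution is to re-estimate this quantile from measured model and data shift.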
Supplementary Material: zip
Primary Area: transfer learning, meta learning, and lifelong learning
Submission Number: 12614