Keywords: Trustworthy Machine Learning, Conformal Prediction, Uncertainty Quantification
Abstract: Conformal prediction is an emerging technique for uncertainty quantification that constructs prediction sets guaranteed to contain the true label with a predefined probability.
Recent work develops online conformal prediction methods that adaptively construct prediction sets to accommodate distribution shifts.
However, existing algorithms typically assume *perfect label accuracy*, an assumption that rarely holds in practice.
In this work, we investigate the robustness of online conformal prediction under uniform label noise with a known noise rate.
We show that label noise causes a persistent gap between the actual mis-coverage rate and the desired rate $\alpha$, leading to either overestimated or underestimated coverage guarantees.
To address this issue, we propose a novel loss function, the *robust pinball loss*, which provides an unbiased estimate of the clean pinball loss without requiring ground-truth labels.
Theoretically, we demonstrate that the robust pinball loss enables online conformal prediction to eliminate the coverage gap under uniform label noise, achieving a convergence rate of $\mathcal{O}(T^{-1/2})$ for both the empirical and expected coverage errors (i.e., the absolute deviation of the empirical and expected mis-coverage rates from the target level $\alpha$).
This loss offers a general remedy for uniform label noise and is complementary to existing online conformal prediction methods.
Extensive experiments demonstrate that the proposed loss improves the noise robustness of various online conformal prediction methods, enabling them to achieve precise coverage guarantees.
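The abstract does not specify the exact form of the robust pinball loss. The sketch below is only an illustration of the general idea it describes: forming an unbiased estimate of the clean pinball-loss (sub)gradient from a noisy label when the uniform noise rate is known, and using it to drive an online threshold update. The noise model (keep the true class with probability $1-\epsilon$, otherwise flip uniformly to one of the other $K-1$ classes), the Dirichlet classifier stand-in, and all names such as `pinball_grad`, `eps`, and `g_hat` are assumptions for illustration, not the submission's implementation.

```python
# Minimal sketch (not the submission's code): online conformal threshold updates
# driven by a noise-corrected pinball-loss gradient under uniform label noise
# with a known rate eps. The noise model and all names here are assumptions.
import numpy as np

rng = np.random.default_rng(0)
K, T, alpha, eps, lr = 10, 20000, 0.1, 0.2, 0.05  # classes, rounds, target miscoverage, noise rate, step size

def pinball_grad(q, s, alpha):
    # Subgradient w.r.t. the threshold q of the level-(1 - alpha) pinball loss
    # max((1 - alpha) * (s - q), alpha * (q - s)).
    return np.where(s <= q, alpha, alpha - 1.0)

q, covered = 0.5, 0
for t in range(T):
    probs = rng.dirichlet(np.ones(K))      # stand-in for a classifier's softmax output
    y = rng.choice(K, p=probs)             # true label (used only to measure coverage)
    # Uniform label noise: keep y w.p. 1 - eps, else flip to one of the other K - 1 classes.
    y_tilde = y if rng.random() > eps else rng.choice([k for k in range(K) if k != y])
    scores = 1.0 - probs                   # nonconformity score of every candidate label

    covered += scores[y] <= q              # does the set {k : scores[k] <= q} contain y?

    # Debiased (sub)gradient: under this noise model,
    #   E[g(y_tilde)] = (1 - eps*K/(K-1)) * g(y) + eps/(K-1) * sum_k g(k),
    # so the expression below is an unbiased estimate of the clean gradient g(y).
    g = pinball_grad(q, scores, alpha)
    g_hat = (g[y_tilde] - eps / (K - 1) * g.sum()) / (1.0 - eps * K / (K - 1))

    q -= lr * g_hat                        # online (sub)gradient step on the threshold

print(f"empirical coverage: {covered / T:.3f}  (target {1 - alpha:.2f})")
```

In this toy setup, replacing `g_hat` with the uncorrected `g[y_tilde]` drives the threshold toward a quantile of the noisy scores rather than the clean ones, reproducing the persistent coverage gap the abstract describes.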
Supplementary Material: zip
Primary Area: General machine learning (supervised, unsupervised, online, active, etc.)
Submission Number: 10940