Abstract: Conformal prediction is an emerging technique for uncertainty quantification that constructs prediction sets guaranteed to contain the true label with a predefined probability. Recent work develops online conformal prediction methods that adaptively construct prediction sets to accommodate distribution shifts. However, existing algorithms typically assume *perfect label accuracy*, which rarely holds in practice. In this work, we investigate the robustness of online conformal prediction under uniform label noise with a known noise rate, for both constant and dynamic learning rate schedules. We show that label noise causes a persistent gap between the actual mis-coverage rate and the desired rate $\alpha$, leading to either overestimated or underestimated coverage guarantees. To address this issue, we propose *Noise Robust Online Conformal Prediction* (dubbed NR-OCP), which updates the threshold with a novel *robust pinball loss* that provides an unbiased estimate of the clean pinball loss without requiring ground-truth labels. Our theoretical analysis shows that NR-OCP eliminates the coverage gap under both constant and dynamic learning rate schedules, achieving a convergence rate of $\mathcal{O}(T^{-1/2})$ for both empirical and expected coverage errors (i.e., the absolute deviation of the mis-coverage rate from the target level $\alpha$) under uniform label noise. Experiments show that our method outperforms the baseline, achieving both precise coverage and improved efficiency.
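For context, standard online conformal prediction updates a score threshold $q_t$ by a (sub)gradient step on the pinball loss of the true label's nonconformity score. The sketch below illustrates one way an unbiased surrogate for the clean pinball loss can be built from noisy labels under uniform label noise with a known rate, via a standard backward-correction argument; this is an illustrative construction under those assumptions, not necessarily the exact form of the robust pinball loss used in NR-OCP. Here $s(\cdot,\cdot)$ denotes a nonconformity score, $\eta_t$ a learning rate, $\epsilon$ the known uniform noise rate, and $K$ the number of classes, all introduced here for illustration.

% illustrative backward-corrected pinball loss; not necessarily the paper's exact construction
\[
q_{t+1} = q_t - \eta_t\, \nabla_q\, \ell_{1-\alpha}\big(s(x_t, y_t),\, q_t\big),
\qquad
\ell_{1-\alpha}(s, q) = (1-\alpha)\,(s - q)_+ + \alpha\,(q - s)_+ .
\]
When only a noisy label $\tilde{y}_t$ is observed (equal to $y_t$ with probability $1-\epsilon$, otherwise uniform over the remaining $K-1$ classes), taking expectations over the noise gives
\[
\mathbb{E}\big[\ell(s(x_t, \tilde{y}_t), q)\big]
= \Big(1 - \tfrac{\epsilon K}{K-1}\Big)\, \ell(s(x_t, y_t), q)
+ \tfrac{\epsilon}{K-1} \sum_{y'} \ell(s(x_t, y'), q),
\]
so the observable quantity
\[
\hat{\ell}(q) = \frac{\ell(s(x_t, \tilde{y}_t), q) - \tfrac{\epsilon}{K-1} \sum_{y'} \ell(s(x_t, y'), q)}{1 - \tfrac{\epsilon K}{K-1}}
\]
satisfies $\mathbb{E}[\hat{\ell}(q)] = \ell(s(x_t, y_t), q)$ whenever $\epsilon < \tfrac{K-1}{K}$, i.e., it is an unbiased estimate of the clean pinball loss that never requires access to $y_t$.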
Primary Area: General Machine Learning->Everything Else
Keywords: Conformal Prediction, Online Learning, Label Noise
Submission Number: 3711