Anchor-Based Conformal Prediction Under Noisy Annotations in Single-Cell Data

ICLR 2026 Conference Submission 20743 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Conformal Prediction, Single-Cell Data, Noisy Labels, Machine Learning
Abstract: Learning predictive models from noisy annotations is a central challenge in modern machine learning, particularly in domains where labels are obtained from multiple imperfect annotators. We introduce an anchor-based conformal prediction framework that provides rigorous uncertainty guarantees even in the presence of label noise. Our method identifies pseudo-anchors by selecting samples with strong agreement across annotators, uses these anchors to train a base predictor, and calibrates top-k conformal sets to ensure valid coverage. This construction yields prediction sets that are both reliable and compact while explicitly accounting for annotation disagreement. Empirically, anchor-guided conformal prediction attains coverage close to nominal targets while producing smaller prediction sets and remaining robust to noisy labels. Although evaluated on single-cell data, the framework offers a general, principled way to integrate multiple noisy annotator signals with conformal prediction, enabling reliable uncertainty estimates in settings where ground-truth labels are scarce, expensive to obtain, or inherently ambiguous, and highlights how conformal methods can be applied to realistic, noisy supervision scenarios.
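The pipeline described in the abstract (anchor selection by annotator agreement, anchor-based training, then top-k conformal calibration) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the anchor rule here is simple unanimous agreement among simulated annotators, the base predictor is a logistic regression on synthetic data standing in for single-cell features, and the conformal score is the rank of the observed label in the predictor's sorted probabilities.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic features standing in for single-cell data; 3 cell-type classes.
X, y_true = make_classification(n_samples=3000, n_features=20, n_informative=10,
                                n_classes=3, n_clusters_per_class=1, random_state=0)

def noisy_annotator(y, flip_p, rng):
    """Simulate an imperfect annotator that flips each label with prob. flip_p."""
    y = y.copy()
    flip = rng.random(len(y)) < flip_p
    y[flip] = rng.integers(0, 3, flip.sum())
    return y

# Three imperfect annotators (hypothetical noise model).
annotations = np.stack([noisy_annotator(y_true, 0.2, rng) for _ in range(3)])

# Pseudo-anchors: samples where all annotators agree (a simple proxy for the
# paper's anchor-selection rule, which the abstract does not fully specify).
agree = (annotations == annotations[0]).all(axis=0)
anchor_idx = rng.permutation(np.flatnonzero(agree))

# Split anchors into a training half and a calibration half.
split = len(anchor_idx) // 2
train_idx, cal_idx = anchor_idx[:split], anchor_idx[split:]
clf = LogisticRegression(max_iter=1000).fit(X[train_idx], annotations[0, train_idx])

# Top-k conformal calibration: the score is the rank (1-indexed) of the
# observed label when classes are sorted by predicted probability.
alpha = 0.1
cal_probs = clf.predict_proba(X[cal_idx])
order = np.argsort(-cal_probs, axis=1)
ranks = (order == annotations[0, cal_idx][:, None]).argmax(axis=1) + 1

# Finite-sample-corrected quantile gives the calibrated set size k_hat.
n_cal = len(ranks)
q = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
k_hat = int(np.quantile(ranks, min(q, 1.0), method="higher"))

# Prediction sets for all cells: the top-k_hat classes by probability.
pred_sets = np.argsort(-clf.predict_proba(X), axis=1)[:, :k_hat]
print(f"anchors: {agree.sum()}, k_hat: {k_hat}")
```

By construction, at least a 1 - alpha fraction of calibration anchors have their observed label inside the top-k_hat set; the set size k_hat adapts to how noisy the supervision is, growing when the predictor is less certain.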
Primary Area: applications to physical sciences (physics, chemistry, biology, etc.)
Submission Number: 20743