Conformal Risk Minimization with Variance Reduction

Published: 01 Jul 2025, Last Modified: 11 Jul 2025, ICML 2025 R2-FM Workshop Poster, CC BY 4.0
Keywords: Uncertainty quantification, Conformal Prediction, conformal risk minimization, length efficiency in conformal prediction, reliable machine learning
Abstract: Conformal prediction (CP) is a distribution-free framework for achieving probabilistic guarantees on black-box models. CP is generally applied to a model post-training. Recent research efforts, on the other hand, have focused on optimizing CP efficiency during training. We formalize this concept as the problem of conformal risk minimization (CRM). In this direction, conformal training (ConfTr) by Stutz et al. (2022) is a CRM technique that seeks to minimize the expected prediction set size of a model by simulating CP between training updates. In this paper, we provide a novel analysis of the ConfTr gradient estimation method, revealing a strong source of sample inefficiency that introduces training instability and limits its practical use. To address this challenge, we propose variance-reduced conformal training (VR-ConfTr), a CRM method that carefully incorporates a novel variance reduction technique into the gradient estimation of the ConfTr objective function. Through extensive experiments on various benchmark datasets, we demonstrate that VR-ConfTr consistently achieves faster convergence and smaller prediction sets compared to baselines.
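For readers unfamiliar with the CP procedure the abstract builds on, the following is a minimal sketch of standard split conformal prediction for classification, the post-training step that ConfTr-style methods simulate during training. All names and the synthetic data are illustrative assumptions, not the paper's implementation; the nonconformity score (one minus the softmax probability of the true class) and the finite-sample quantile correction are the standard choices from the CP literature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: softmax outputs of some classifier on a
# held-out calibration set for a 5-class problem.
n_cal, n_classes = 500, 5
cal_probs = rng.dirichlet(np.ones(n_classes), size=n_cal)
cal_labels = rng.integers(0, n_classes, size=n_cal)

# Nonconformity score: 1 - softmax probability assigned to the true class.
cal_scores = 1.0 - cal_probs[np.arange(n_cal), cal_labels]

# Conformal quantile at miscoverage level alpha = 0.1, with the
# standard (n+1) finite-sample correction.
alpha = 0.1
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
q_hat = np.quantile(cal_scores, q_level, method="higher")

# Prediction set for a new test point: every class whose
# nonconformity score falls at or below the threshold.
test_probs = rng.dirichlet(np.ones(n_classes))
pred_set = np.where(1.0 - test_probs <= q_hat)[0]
print(q_hat, pred_set)
```

The expected set size of `pred_set` over test points is exactly the "length efficiency" quantity that the ConfTr and VR-ConfTr objectives aim to shrink while preserving the coverage guarantee.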
Submission Number: 105