Efficient Conformal Prediction with Order-Preserving Predictions for Classifiers

03 Sept 2025 (modified: 24 Nov 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: classification, predictive uncertainty, conformal prediction
TL;DR: We propose Flexible Prediction Sets, a framework that uses an order-preserving transform on model predictions to shrink conformal prediction sets while maintaining coverage guarantees.
Abstract: Conformal prediction provides prediction sets with distribution-free, finite-sample coverage guarantees for machine learning classifiers. Numerous methods reduce set size by retraining classifiers or designing novel non-conformity scores, but they often suffer from high computational cost or inflexibility. To address these issues, we propose Flexible Prediction Sets (FPS), a post-hoc framework that learns an order-preserving transformation: it keeps the ranking of the model's predicted class probabilities while reshaping their magnitudes, enabling smaller conformal prediction sets. The transformation is obtained by optimizing a smooth surrogate of the set-size objective on a tuning dataset and is then applied to the predicted class probabilities before conformal calibration. This yields smaller prediction sets while maintaining the coverage level. Theoretically, we prove coverage preservation under the transformation, provide generalization bounds for the function class and the surrogate risk, and show convergence to a stationary point. Empirically, extensive experiments on image and text benchmarks with multiple base classifiers demonstrate consistent reductions in set size across nominal coverage rates, outperforming conformal prediction baselines.
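
To make the pipeline in the abstract concrete, here is a minimal runnable sketch, not the authors' implementation. The concrete choices are assumptions: a per-class power map p ↦ p^γ / Σ_j p_j^γ stands in for the learned order-preserving transform (monotone for γ > 0), the non-conformity score is the simple 1 − p_y split-conformal score, and γ is picked by grid search on a tuning split rather than by the paper's smooth-surrogate optimization.

```python
# Hypothetical sketch of the FPS idea: tune an order-preserving transform on a
# tuning split, calibrate conformally on a disjoint split, evaluate on a third.
import numpy as np

def power_transform(probs, gamma):
    """Order-preserving reshaping: class ranks are unchanged for gamma > 0."""
    q = probs ** gamma
    return q / q.sum(axis=1, keepdims=True)

def conformal_quantile(probs, labels, alpha):
    """Split-conformal threshold for the score s(x, y) = 1 - p_y."""
    n = len(labels)
    scores = 1.0 - probs[np.arange(n), labels]
    k = int(np.ceil((n + 1) * (1 - alpha)))  # finite-sample correction
    return np.sort(scores)[min(k, n) - 1]

def prediction_sets(probs, qhat):
    """Boolean mask (n, K): class y is included iff 1 - p_y <= qhat."""
    return (1.0 - probs) <= qhat

def tune_gamma(tune_probs, tune_labels, alpha, grid=np.linspace(0.2, 5.0, 25)):
    """Grid-search stand-in for the surrogate optimization: pick the gamma
    giving the smallest average calibrated set size on the tuning split."""
    best_gamma, best_size = 1.0, np.inf
    for gamma in grid:
        p = power_transform(tune_probs, gamma)
        qhat = conformal_quantile(p, tune_labels, alpha)
        size = prediction_sets(p, qhat).sum(axis=1).mean()
        if size < best_size:
            best_gamma, best_size = gamma, size
    return best_gamma

# Toy usage: random softmax outputs standing in for a classifier's predictions.
rng = np.random.default_rng(0)
K, alpha = 10, 0.1
logits = rng.normal(size=(3000, K))
probs = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)
labels = np.array([rng.choice(K, p=p) for p in probs])

tune, cal, test = slice(0, 1000), slice(1000, 2000), slice(2000, 3000)
gamma = tune_gamma(probs[tune], labels[tune], alpha)
qhat = conformal_quantile(power_transform(probs[cal], gamma), labels[cal], alpha)
sets = prediction_sets(power_transform(probs[test], gamma), qhat)
covered = sets[np.arange(1000), labels[test]].mean()
print(f"gamma={gamma:.2f}  coverage={covered:.3f}  avg size={sets.sum(1).mean():.2f}")
```

The split structure is the point of the sketch: because the transform is fit on tuning data disjoint from the calibration data, the calibration scores remain exchangeable with the test scores, so the standard split-conformal coverage guarantee is preserved under the transformation.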
Supplementary Material: zip
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 1275