Exploring the Link Between Out-of-Distribution Detection and Conformal Prediction with Illustrations of Its Benefits

TMLR Paper 4093 Authors

30 Jan 2025 (modified: 09 May 2025) · Rejected by TMLR · CC BY 4.0
Abstract: Research on Out-Of-Distribution (OOD) detection focuses mainly on building scores that efficiently distinguish OOD data from In-Distribution (ID) data. By contrast, Conformal Prediction (CP) uses non-conformity scores to construct prediction sets with probabilistic coverage guarantees. In other words, the former designs scores, while the latter designs probabilistic guarantees based on scores. Therefore, we argue that these two fields might be naturally intertwined. This work advocates for cross-fertilization between OOD detection and CP by formalizing their link and emphasizing two benefits of using them jointly. First, we show that in standard OOD benchmark settings, evaluation metrics can be overly optimistic due to the finite sample size of the test dataset. Building on the work of Bates et al. (2022), we define new *conformal AUROC* and *conformal FPR@TPR95* metrics, corrections that provide probabilistic conservativeness guarantees accounting for the variability of these metrics. We show the effect of these corrections on two reference OOD and anomaly detection benchmarks, OpenOOD (Yang et al., 2022) and ADBench (Han et al., 2022). Second, we explore using OOD scores as non-conformity scores and show that they can improve the efficiency of the prediction sets obtained with CP.
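As a minimal illustration of the second point (a sketch, not the authors' code), the snippet below reuses an OOD-style score, the negative softmax probability of a candidate label, as the non-conformity score in standard split conformal prediction. The function names, the `alpha` level, and the placeholder data are assumptions made for this example.

```python
# Sketch: split conformal prediction with an OOD-style (MSP-based) non-conformity score.
# Assumes softmax probabilities are available for a held-out calibration set and a test set.
import numpy as np

def calibrate_threshold(cal_probs, cal_labels, alpha=0.1):
    """Compute the conformal quantile q_hat on the calibration set."""
    n = len(cal_labels)
    # Non-conformity score: negative softmax probability of the true class
    # (i.e., an OOD-style confidence score restricted to the true label).
    scores = -cal_probs[np.arange(n), cal_labels]
    # Finite-sample corrected quantile level for marginal coverage >= 1 - alpha.
    level = np.ceil((n + 1) * (1 - alpha)) / n
    return np.quantile(scores, min(level, 1.0), method="higher")

def prediction_set(test_probs, q_hat):
    """Boolean mask over classes: True means the class enters the prediction set."""
    return -test_probs <= q_hat

# Usage with random placeholder probabilities (10 classes).
rng = np.random.default_rng(0)
cal_probs = rng.dirichlet(np.ones(10), size=500)
cal_labels = rng.integers(0, 10, size=500)
q_hat = calibrate_threshold(cal_probs, cal_labels, alpha=0.1)
test_probs = rng.dirichlet(np.ones(10), size=5)
sets = prediction_set(test_probs, q_hat)  # shape (5, 10) boolean mask
```

Any OOD score defined per (input, label) pair could be substituted for the MSP-based score above; the efficiency claim in the abstract concerns how the choice of score affects the average size of the resulting prediction sets.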
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission:

All changes are highlighted in blue in the PDF.

Assigned Action Editor: Ofir Lindenbaum
Submission Number: 4093