A Unified Comparative Study with Generalized Conformity Scores for Multi-Output Conformal Regression

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: We present a unified study comparing multi-output conformal regression methods and introducing two generalizations of univariate conformity scores.
Abstract: Conformal prediction provides a powerful framework for constructing distribution-free prediction regions with finite-sample coverage guarantees. While extensively studied in univariate settings, its extension to multi-output problems presents additional challenges, including complex output dependencies and high computational costs, and remains relatively underexplored. In this work, we present a unified comparative study of nine conformal methods with different multivariate base models for constructing multivariate prediction regions within the same framework. This study highlights their key properties while also exploring the connections between them. Additionally, we introduce two novel classes of conformity scores for multi-output regression that generalize their univariate counterparts. These scores ensure asymptotic conditional coverage while maintaining exact finite-sample marginal coverage. One class is compatible with any generative model, offering broad applicability, while the other is computationally efficient, leveraging the properties of invertible generative models. Finally, we conduct a comprehensive empirical evaluation across 13 tabular datasets, comparing all the multi-output conformal methods explored in this work. To ensure a fair and consistent comparison, all methods are implemented within a unified code base.
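To illustrate the general framework the abstract builds on, here is a minimal sketch of split conformal prediction in the univariate case, using the standard absolute-residual conformity score and the finite-sample quantile correction. This is an illustrative example of the classical setup, not code from the paper's repository, and the function name is hypothetical.

```python
import numpy as np

def split_conformal_interval(residuals_cal, y_pred_test, alpha=0.1):
    """Prediction intervals with >= 1 - alpha marginal coverage.

    residuals_cal: conformity scores |y_i - yhat_i| on a held-out
                   calibration set (hypothetical helper, for illustration).
    y_pred_test:   point predictions for the test inputs.
    """
    n = len(residuals_cal)
    # Finite-sample-corrected quantile level ceil((n + 1)(1 - alpha)) / n,
    # which yields the exact marginal coverage guarantee.
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    q_hat = np.quantile(residuals_cal, q_level, method="higher")
    return y_pred_test - q_hat, y_pred_test + q_hat

# Toy usage: calibration residuals from a synthetic predictor.
rng = np.random.default_rng(0)
residuals = np.abs(rng.normal(size=1000))
lo, hi = split_conformal_interval(residuals, np.array([0.0]), alpha=0.1)
```

The resulting interval is symmetric around each point prediction; the multi-output methods compared in the paper generalize this score-then-quantile recipe to region shapes that can capture dependencies between outputs.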
Lay Summary: When an AI makes a prediction, how certain can we be? For complex tasks, like forecasting several related economic indicators at once, we need more than a single best guess; instead, we need a reliable "range of possibilities." However, creating these reliable ranges for multiple predictions simultaneously is a difficult and relatively unexplored problem, with no clear guidance on which technique works best. We addressed this by first conducting a large-scale, fair comparison of nine existing methods to see how they stack up. Then, we designed two new, improved approaches as generalizations of single-prediction methods. Our first new method is highly flexible and works with many different types of AI models, while our second is designed to be computationally efficient. Our work provides a clear roadmap for researchers, helping them choose the right tool for building more trustworthy AI systems that can accurately report their own uncertainty. Our new techniques offer powerful and practical options, making it easier to develop reliable models for complex, real-world prediction challenges.
Link To Code: https://github.com/Vekteur/multi-output-conformal-regression
Primary Area: Probabilistic Methods->Everything Else
Keywords: Uncertainty Quantification, Multivariate regression, Conformal Prediction, Generative Models
Submission Number: 10592