Keywords: Conformal Prediction
Abstract: Conformal prediction (CP) constructs prediction sets with a marginal coverage guarantee of $1 - \alpha$, assuming the calibration distribution $P_{XY}$ and test distribution $Q_{XY}$ are identical. Under distribution shift, existing approaches align calibration and test conformal scores only at the marginal level, which helps preserve marginal coverage. However, ignoring their mismatched conditional score distributions can lead to poor conditional coverage at individual test inputs. In response, we introduce the conditional coverage gap (CCG) and its expectation over $Q_X$ to quantify the robustness of the conditional guarantee. To study how a distribution shift propagates from data to conformal scores, we use the Wasserstein distance between $P_{XY}$ and $Q_{XY}$ to bound the expected CCG. This bound implies that an invertible transformation between $P_{XY}$ and $Q_{XY}$ obtained via Wasserstein minimization can promote robust conditional coverage. Lastly, we implement this idea with a Branched Normalizing Flow (BNF), a two-branch structure in which the $X$-branch transports test inputs from $Q_X$ to $P_X$ to obtain prediction sets with a conditional guarantee on $P_{Y|X}$, and the $Y$-branch inversely maps these sets back with the conditional guarantee preserved on $Q_{Y|X}$. Extensive experiments on nine datasets demonstrate that BNF consistently reduces the CCG and improves coverage robustness across various confidence levels under distribution shift.
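To make the setting concrete, the following is a minimal sketch of standard split conformal prediction under the identical-distribution assumption ($P_{XY} = Q_{XY}$) that the abstract takes as its starting point. It is a generic illustration on synthetic data, not the paper's BNF method; the linear model, noise level, and absolute-residual score are all illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative fixed predictor; in practice this would be a trained model.
def model(x):
    return 2.0 * x

# Calibration set drawn from P_XY: y = 2x + Gaussian noise (assumption).
n_cal = 1000
x_cal = rng.uniform(0, 1, n_cal)
y_cal = model(x_cal) + rng.normal(0, 0.1, n_cal)

# Conformal scores: absolute residuals on the calibration set.
scores = np.abs(y_cal - model(x_cal))

# Split-conformal quantile at miscoverage level alpha.
alpha = 0.1
q_level = np.ceil((n_cal + 1) * (1 - alpha)) / n_cal
q_hat = np.quantile(scores, q_level, method="higher")

# Prediction set for a test input x: [model(x) - q_hat, model(x) + q_hat].
# With test data from the same distribution Q_XY = P_XY, marginal
# coverage is guaranteed to be at least 1 - alpha.
n_test = 5000
x_test = rng.uniform(0, 1, n_test)
y_test = model(x_test) + rng.normal(0, 0.1, n_test)
covered = np.abs(y_test - model(x_test)) <= q_hat
print(round(covered.mean(), 3))  # empirical coverage, close to 1 - alpha
```

When $Q_{XY}$ differs from $P_{XY}$, the calibration quantile `q_hat` no longer reflects the test score distribution, which is the failure mode the paper's conditional coverage gap quantifies.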
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 3637