Keywords: conformal prediction
TL;DR: Existing theoretical justifications for full conformal prediction with stochastic scores are either too strict or incorrect. We provide a more general, yet still correct, sufficient condition.
Abstract: The theory of full conformal prediction assumes a deterministic non-conformity measure, but modern applications of full conformal prediction often rely on machine-learning training, making stochasticity inevitable. The simple sufficient condition of almost-sure permutation invariance of the non-conformity measure can be too restrictive, so many have suggested relaxing it to permutation invariance in distribution as a condition for the validity of full conformal prediction. We show, however, that this commonly cited condition is actually insufficient. We then provide a correct sufficient condition, \emph{Conditional Independence \& Permutation Invariance in Distribution}, which encompasses several stochastic settings that arise in machine learning.
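For context on the setting the abstract describes, here is a minimal sketch of full conformal prediction with a deterministic, permutation-invariant non-conformity measure (the baseline case whose theory is classical). The score used, absolute deviation from the mean of the augmented responses, and the function name are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def full_conformal_set(y_train, candidates, alpha=0.1):
    """Full conformal prediction set for one test point (toy sketch).

    The non-conformity measure is deterministic and permutation
    invariant: each point is scored by its absolute deviation from
    the mean of the augmented sample, so the score depends only on
    the multiset of values, not on their order.
    """
    prediction_set = []
    for y in candidates:
        # Augment the training responses with the candidate value y.
        ys = np.append(y_train, y)
        # Deterministic, permutation-invariant non-conformity scores.
        scores = np.abs(ys - ys.mean())
        # Conformal p-value: fraction of augmented points whose score
        # is at least as large as the candidate's own score.
        p = np.mean(scores >= scores[-1])
        # Keep y if it is not too non-conforming at level alpha.
        if p > alpha:
            prediction_set.append(y)
    return prediction_set
```

Under exchangeability, this construction yields marginal coverage at least 1 - alpha; the paper's question is what happens when the score itself is stochastic (e.g., produced by randomized training), where almost-sure permutation invariance fails.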
Submission Number: 34