Predictive Inference Is Really Free with In-Context Learning

Published: 05 Mar 2025, Last Modified: 19 Mar 2025 · QUESTION Poster · CC BY 4.0
Keywords: In-context Learning, Transformers, Regression, Predictive Inference, Conformal Prediction
TL;DR: We propose a method for constructing prediction intervals, called in-context Jackknife+, that uses a meta-learned transformer trained via in-context learning to perform training-free leave-one-out (LOO) predictions.
Abstract: In this work, we consider the problem of constructing prediction intervals (PIs) for point predictions obtained using transformers. We propose a novel method for constructing PIs, called in-context Jackknife+ (ICJ+), that uses a meta-learned transformer trained via in-context learning (ICL) to perform training-free leave-one-out (LOO) predictions, i.e., by only prompting the transformer with LOO datasets and performing no retraining. By leveraging the stability of in-context-trained transformers, we provide distribution-free coverage guarantees for the proposed ICJ+ algorithm under mild assumptions. We evaluate the coverage and width of the intervals obtained using ICJ+ on synthetic i.i.d. data for five classes of functions, and observe that their performance is comparable or superior to that of the benchmark Jackknife+ (J+) and the true confidence intervals.
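To make the construction concrete, here is a minimal sketch of the ICJ+ idea as described in the abstract: each LOO "refit" is just a new prompt with one example dropped from the context, and the interval follows the standard Jackknife+ recipe. The `model(context_x, context_y, query_x)` interface is a hypothetical stand-in for the meta-learned transformer, and the quantile convention approximates the usual finite-sample J+ quantiles; neither is taken verbatim from the paper.

```python
import numpy as np

def in_context_jackknife_plus(model, X, y, x_test, alpha=0.1):
    """Sketch of ICJ+: a Jackknife+ prediction interval at x_test built
    from training-free LOO predictions of an in-context-trained model.
    `model(ctx_x, ctx_y, query_x)` is a hypothetical interface returning
    a scalar point prediction."""
    n = len(y)
    lo_terms, hi_terms = [], []
    for i in range(n):
        # The LOO "refit" is only a new prompt: drop example i from the context.
        mask = np.arange(n) != i
        mu_i_xi = model(X[mask], y[mask], X[i])    # LOO prediction at x_i
        mu_i_xt = model(X[mask], y[mask], x_test)  # LOO prediction at x_test
        r_i = abs(y[i] - mu_i_xi)                  # LOO residual
        lo_terms.append(mu_i_xt - r_i)
        hi_terms.append(mu_i_xt + r_i)
    # Jackknife+ interval: lower/upper empirical quantiles of the shifted
    # LOO predictions (approximating the floor/ceil (n+1)-quantiles of J+).
    lower = np.quantile(lo_terms, alpha, method="lower")
    upper = np.quantile(hi_terms, 1 - alpha, method="higher")
    return lower, upper
```

Because the transformer is never retrained, the n LOO predictions cost n forward passes over slightly shortened prompts, which is what makes the predictive inference here essentially "free".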
Submission Number: 32