Quantifying Uncertainty in Foundation Models via Ensembles

Published: 18 Nov 2022, Last Modified: 05 May 2023
RobustSeq @ NeurIPS 2022 Poster
Keywords: fine-tuned ensembles, uncertainty quantification, model disagreements, foundation models
TL;DR: Investigation of ensembles for quantifying uncertainty in foundation models
Abstract: As large-scale foundation models begin to have increasing impact in real-world applications, it is important for reliability and trustworthiness that these models "know what they don't know": that they are capable of quantifying uncertainty about their own outputs. In this work, we propose disagreement among model ensembles as an effective and compute-efficient method to quantify uncertainty. We also conduct a systematic study of uncertainty quantification spanning multiple tasks - a synthetic string task, and natural-language arithmetic and question-answering tasks - over a progression of increasingly out-of-distribution inputs. We find that considering ensemble disagreement improves uncertainty prediction over considering only a single model's likelihood. We hope that our investigation and results encourage more research on uncertainty quantification in foundation models and on the use of model ensembles.
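The abstract contrasts ensemble disagreement with a single model's likelihood as an uncertainty signal but does not spell out a specific metric here. The sketch below is only an illustration of that general idea, not the authors' implementation: the function name `ensemble_disagreement`, the input layout `member_probs`, and the particular scores (majority-vote disagreement and predictive entropy alongside a likelihood-style confidence) are all assumptions made for this example.

```python
import numpy as np

def ensemble_disagreement(member_probs: np.ndarray) -> dict:
    """Toy uncertainty scores from an ensemble of fine-tuned models.

    member_probs: array of shape (n_members, n_classes), where each row
    is one model's probability distribution over candidate outputs.
    """
    mean_probs = member_probs.mean(axis=0)

    # Single-model-style confidence: likelihood of the top averaged prediction.
    max_likelihood = float(mean_probs.max())

    # Disagreement: fraction of members whose top prediction differs
    # from the ensemble's majority prediction.
    votes = member_probs.argmax(axis=1)
    majority = np.bincount(votes, minlength=member_probs.shape[1]).argmax()
    vote_disagreement = float((votes != majority).mean())

    # Entropy of the averaged predictive distribution (higher = more uncertain).
    predictive_entropy = float(-(mean_probs * np.log(mean_probs + 1e-12)).sum())

    return {
        "max_likelihood": max_likelihood,
        "vote_disagreement": vote_disagreement,
        "predictive_entropy": predictive_entropy,
    }

# Example: three ensemble members scoring four candidate answers.
probs = np.array([
    [0.70, 0.10, 0.10, 0.10],
    [0.20, 0.60, 0.10, 0.10],
    [0.65, 0.15, 0.10, 0.10],
])
print(ensemble_disagreement(probs))
```

In this toy example the averaged distribution still looks fairly confident, but the nonzero vote disagreement flags that the members do not agree, which is the kind of signal the paper argues improves uncertainty prediction over a single model's likelihood alone.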