Keywords: Uncertainty Quantification, Uncertainty Disentanglement, Aleatoric Uncertainty, Epistemic Uncertainty, Abstained Prediction, Out-of-Distribution Detection
TL;DR: We evaluate recent uncertainty quantifiers on various practical tasks to determine if they can provide disentangled uncertainty estimates.
Abstract: Uncertainty quantification, once a singular task, has evolved into a spectrum of tasks, including abstained prediction, out-of-distribution detection, and aleatoric uncertainty quantification. The latest goal is disentanglement: the construction of multiple estimators, each tailored to one and only one source of uncertainty. This paper evaluates a broad range of Bayesian, evidential, and deterministic methods across these uncertainty tasks on ImageNet. We find that, despite promising theoretical work, disentanglement is not yet achieved in practice. Further, we reveal which uncertainty estimators excel at which specific tasks, providing insights for practitioners and guiding future research toward task-centric and disentangled uncertainty estimation methods. Our code is available at https://anonymous.4open.science/r/bud-ED1B/.
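To make the notion of disentanglement concrete, here is a minimal sketch of one widely used decomposition (not the paper's specific estimators): total predictive entropy from an ensemble or MC-dropout sampler is split into an aleatoric term (expected entropy) and an epistemic term (mutual information). The function names and shapes are illustrative assumptions.

```python
import numpy as np

def entropy(p, axis=-1):
    # Shannon entropy in nats; small epsilon guards against log(0)
    return -np.sum(p * np.log(p + 1e-12), axis=axis)

def disentangle(probs):
    """Entropy-based uncertainty decomposition (illustrative sketch).

    probs: array of shape (S, C) -- S stochastic forward passes
    (e.g. MC-dropout samples or ensemble members), each a
    categorical distribution over C classes.

    Returns (total, aleatoric, epistemic):
      total     = H[E_s[p_s]]           (predictive entropy)
      aleatoric = E_s[H[p_s]]           (expected entropy)
      epistemic = total - aleatoric     (mutual information)
    """
    mean_p = probs.mean(axis=0)
    total = entropy(mean_p)
    aleatoric = entropy(probs, axis=-1).mean()
    epistemic = total - aleatoric
    return total, aleatoric, epistemic
```

When all samples agree, the epistemic term vanishes and only aleatoric uncertainty remains; sample disagreement produces a positive epistemic term. The paper's finding is that, in practice, such decompositions do not cleanly separate the two sources.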
Submission Number: 68