Ensembles for Uncertainty Estimation: Benefits of Prior Functions and Bootstrapping

Published: 09 May 2023 · Last Modified: 09 May 2023 · Accepted by TMLR
Abstract: In machine learning, an agent needs to estimate uncertainty in order to explore efficiently, adapt, and make effective decisions. A common approach to uncertainty estimation maintains an ensemble of models. In recent years, several approaches have been proposed for training ensembles, and conflicting views prevail with regard to the importance of various ingredients of these approaches. In this paper, we examine the benefits of two ingredients -- prior functions and bootstrapping -- which have come into question. We show that prior functions can significantly improve an ensemble agent's joint predictions across inputs and that bootstrapping affords additional benefits if the signal-to-noise ratio varies across inputs. Our claims are justified by both theoretical and experimental results.
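To illustrate the two ingredients the abstract names, below is a minimal sketch (not the paper's code) of an ensemble combining additive prior functions with bootstrapping: each member fits a trainable model to a bootstrap resample of the data and predicts f_k(x) + beta * p_k(x), where p_k is a fixed, untrained random prior function. The names `beta`, `n_members`, and the random-feature basis are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data with input-dependent (heteroscedastic) noise.
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0]) + 0.1 * (1.0 + np.abs(X[:, 0])) * rng.normal(size=200)

def features(x):
    """Fixed sinusoidal feature basis shared by all ensemble members."""
    W = np.linspace(1.0, 10.0, 20)[None, :]  # fixed frequencies
    return np.concatenate([np.sin(x @ W), np.cos(x @ W)], axis=1)

n_members, beta, ridge = 10, 1.0, 1e-3
Phi = features(X)
members = []
for k in range(n_members):
    # Bootstrapping: resample the training set with replacement for this member.
    idx = rng.integers(0, len(X), size=len(X))
    # Prior function p_k: random weights that are fixed and never trained.
    w_prior = rng.normal(size=Phi.shape[1])
    # Train f_k so that f_k(x) + beta * p_k(x) fits the resampled targets
    # (closed-form ridge regression on the residual after subtracting the prior).
    residual = y[idx] - beta * Phi[idx] @ w_prior
    A = Phi[idx].T @ Phi[idx] + ridge * np.eye(Phi.shape[1])
    w_fit = np.linalg.solve(A, Phi[idx].T @ residual)
    members.append((w_fit, w_prior))

def predict(x):
    """Per-member predictions; the spread across members estimates uncertainty."""
    phi = features(x)
    return np.stack([phi @ w_fit + beta * phi @ w_prior for w_fit, w_prior in members])

# Compare an in-distribution input (0.0) with an out-of-distribution input (2.0):
# the ensemble standard deviation should be noticeably larger for the latter.
preds = predict(np.array([[0.0], [2.0]]))
print("mean:", preds.mean(axis=0), "std:", preds.std(axis=0))
```

In this sketch, the fixed random priors keep the members' predictions diverse away from the training data, while bootstrapping injects additional diversity that reflects how informative each region of the input space is.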
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: We have incorporated the following changes based on the comments of the reviewers and the action editor:
1. Included and discussed the references suggested by the reviewers.
2. Updated the introduction to clarify the importance of uncertainty estimation and the important role of ensemble agents in this area.
3. Updated the writing to further clarify some of the experimental results.
4. Updated the figures to include error bars.
5. Updated the conclusion to include more discussion of potential extensions of this paper.
6. Corrected typos and grammatical mistakes in the earlier version of the paper.
Finally, we would like to thank the reviewers and the action editor for taking the time to provide us with helpful feedback.
Assigned Action Editor: ~Gergely_Neu1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 497