Bayesian Quadrature for Neural Ensemble Search

Published: 25 Jul 2023, Last Modified: 25 Jul 2023. Accepted by TMLR.
Abstract: Ensembling can improve the performance of Neural Networks, but existing approaches struggle when the architecture likelihood surface has dispersed, narrow peaks. Furthermore, existing methods construct equally weighted ensembles, which are likely to be vulnerable to the failure modes of their weaker members. By viewing ensembling as approximately marginalising over architectures, we construct ensembles using the tools of Bayesian Quadrature -- tools well suited to exploring likelihood surfaces with dispersed, narrow peaks. Additionally, the resulting ensembles consist of architectures weighted commensurately with their performance. We show empirically -- in terms of test likelihood, accuracy, and expected calibration error -- that our method outperforms state-of-the-art baselines, and verify via ablation studies that each of its components contributes independently to this improvement.
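To make the weighting idea concrete, below is a minimal illustrative sketch of how Bayesian Quadrature assigns non-uniform ensemble weights. It assumes a 1D "architecture embedding", an RBF kernel, and a Gaussian prior (all illustrative choices; the helper name `bq_weights` and the toy numbers are hypothetical and not taken from the paper, which operates over discrete architecture spaces with its own kernel and acquisition machinery -- see the linked code repository for the actual method).

```python
import numpy as np

def bq_weights(x, lengthscale=1.0, prior_mu=0.0, prior_sigma=1.0, jitter=1e-8):
    """Bayesian Quadrature weights for nodes x under an RBF kernel
    and a Gaussian prior N(prior_mu, prior_sigma^2).

    The BQ estimate of E_pi[f] is sum_i w_i f(x_i), with w = K^{-1} z,
    where K is the kernel Gram matrix at the nodes and
    z_i = E_pi[k(x, x_i)] is the kernel mean embedding of the prior
    (available in closed form for an RBF kernel and Gaussian prior).
    """
    x = np.asarray(x, dtype=float)
    sq_dists = (x[:, None] - x[None, :]) ** 2
    K = np.exp(-sq_dists / (2 * lengthscale**2))
    K += jitter * np.eye(len(x))  # numerical stabilisation

    # Closed-form prior integral of the RBF kernel at each node.
    s2 = lengthscale**2 + prior_sigma**2
    z = np.sqrt(lengthscale**2 / s2) * np.exp(-(x - prior_mu) ** 2 / (2 * s2))

    return np.linalg.solve(K, z)

# Toy example: three "architectures" at embedding locations x, each
# producing a predictive distribution over two classes.
x = np.array([-1.0, 0.2, 1.5])
member_probs = np.array([
    [0.9, 0.1],  # architecture 1
    [0.7, 0.3],  # architecture 2
    [0.4, 0.6],  # architecture 3
])

w = bq_weights(x)
# Raw BQ weights need not sum to one (and can in general be negative);
# here we simply renormalise so the mixture is a valid distribution.
w = w / w.sum()

ensemble_probs = w @ member_probs
print("BQ weights:", w)
print("Ensemble prediction:", ensemble_probs)
```

The key contrast with standard ensembling is the final mixture: instead of averaging member predictions with weight 1/N each, the members are combined with quadrature weights that depend on where they sit on the (modelled) likelihood surface.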
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: For the camera-ready version: - The text colour of additions that addressed reviewer comments has been changed from red to black. - Minor typographical errors have been fixed.
Code: https://github.com/saadhamidml/bq-nes
Supplementary Material: zip
Assigned Action Editor: ~Kevin_Swersky1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 951