Scalable Bayesian Monte Carlo: fast uncertainty estimation beyond deep ensembles

ICLR 2026 Conference Submission 20890 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Bayesian deep learning, MCMC, SMC, ensembles
TL;DR: A new Bayesian deep learning (BDL) method, SBMC, interpolates between the MAP estimator and the posterior, delivering good accuracy and uncertainty quantification at a run-time comparable to deep ensembles.
Abstract: This work introduces a new method for Bayesian deep learning called scalable Bayesian Monte Carlo (SBMC). The method consists of a model and an algorithm. The model interpolates between a point estimator and the posterior. The algorithm is a parallel implementation of a sequential Monte Carlo sampler ($\mathrm{SMC}_\parallel$) or of Markov chain Monte Carlo ($\mathrm{MCMC}_\parallel$). We refer to these consistent (asymptotically unbiased) algorithms collectively as Bayesian Monte Carlo (BMC), and any such algorithm can be used in our SBMC method. The utility of the method is demonstrated on practical examples: MNIST, CIFAR, and IMDb. A systematic numerical study reveals that, for the same wall-clock time as state-of-the-art (SOTA) methods such as deep ensembles (DE), SBMC achieves comparable or better accuracy and substantially improved uncertainty quantification (UQ), in particular epistemic UQ. The benefit is demonstrated on the downstream task of estimating the confidence in predictions, which can be used for reliability assessment or abstention decisions. Code is available in the supplementary material.
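The abstract names two ingredients without specifying them: a model interpolating between a point estimator and the posterior, and a parallel consistent sampler. One standard construction in that spirit (an assumption here, not necessarily the paper's) is a tempered target $\pi_T(\theta) \propto [p(\theta)\,p(D\mid\theta)]^{1/T}$, which recovers the posterior at $T=1$ and concentrates on the MAP estimator as $T \to 0$, sampled by embarrassingly parallel MCMC chains initialized at the MAP point. The sketch below is a minimal hypothetical illustration of that idea on a toy Gaussian; the function names (`log_tempered_target`, `parallel_rwm`) and all parameters are invented for this sketch.

```python
import numpy as np

def log_tempered_target(theta, log_prior, log_lik, temp):
    """Tempered target log-density (up to an additive constant).

    temp = 1 recovers the posterior; temp -> 0 concentrates on the MAP
    estimator, giving one simple interpolation between the two.
    """
    return (log_prior(theta) + log_lik(theta)) / temp

def parallel_rwm(log_target, theta_map, n_chains=8, n_steps=2000,
                 step=0.1, seed=0):
    """Embarrassingly parallel random-walk Metropolis chains.

    Each chain starts at the MAP estimate and runs independently; the
    final state of each chain is returned as one approximate sample.
    """
    rng = np.random.default_rng(seed)
    theta = np.tile(theta_map, (n_chains, 1)).astype(float)
    logp = np.array([log_target(t) for t in theta])
    for _ in range(n_steps):
        proposal = theta + step * rng.standard_normal(theta.shape)
        logp_prop = np.array([log_target(t) for t in proposal])
        # Standard Metropolis accept/reject, vectorized over chains.
        accept = np.log(rng.random(n_chains)) < (logp_prop - logp)
        theta[accept] = proposal[accept]
        logp[accept] = logp_prop[accept]
    return theta

# Toy usage: conjugate 2-D Gaussian, so the answer is checkable.
if __name__ == "__main__":
    log_prior = lambda th: -0.5 * th @ th                    # N(0, I) prior
    data_mean = np.array([1.0, -1.0])
    log_lik = lambda th: -5.0 * (th - data_mean) @ (th - data_mean)
    target = lambda th: log_tempered_target(th, log_prior, log_lik, temp=0.5)
    theta_map = data_mean * 10.0 / 11.0                      # exact MAP here
    samples = parallel_rwm(target, theta_map)
    print(samples.mean(axis=0))  # should lie near the MAP/posterior mean
```

In the paper's setting the target would be a neural-network posterior rather than a toy Gaussian, and the consistency claim rests on each chain (or the SMC sampler) being asymptotically unbiased, unlike a deep ensemble of independent point estimates.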
Supplementary Material: zip
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 20890