BARNN: A Bayesian Autoregressive and Recurrent Neural Network

Published: 01 May 2025 · Last Modified: 18 Jun 2025 · ICML 2025 poster · CC BY 4.0
TL;DR: BARNN is a Bayesian framework that converts any autoregressive or recurrent neural network into a Bayesian version, improving performance and uncertainty quantification with minimal model changes
Abstract: Autoregressive and recurrent networks have achieved remarkable progress across various fields, from weather forecasting to molecular generation and Large Language Models. Despite their strong predictive capabilities, these models lack a rigorous framework for addressing uncertainty, which is key in scientific applications such as PDE solving, molecular generation and machine learning Force Fields. To address this shortcoming, we present BARNN: a variational Bayesian Autoregressive and Recurrent Neural Network. BARNN aims to provide a principled way to turn any autoregressive or recurrent model into its Bayesian version. BARNN is based on the variational dropout method, which allows it to scale to large recurrent neural networks. We also introduce a temporal version of the “Variational Mixtures of Posteriors” prior (tVAMP-prior) to make Bayesian inference efficient and well-calibrated. Extensive experiments on PDE modelling and molecular generation demonstrate that BARNN not only achieves comparable or superior accuracy compared to existing methods, but also excels in uncertainty quantification and modelling long-range dependencies.
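To give a concrete feel for the variational-dropout idea the abstract refers to, here is a minimal, hypothetical PyTorch sketch of Gal-and-Ghahramani-style variational dropout in a recurrent model, where one dropout mask is sampled per sequence and reused at every time step, and repeated stochastic forward passes yield a Monte Carlo uncertainty estimate. This is not the BARNN implementation or its tVAMP-prior (see the linked repository for the authors' code); all class and function names below are illustrative assumptions.

```python
import torch
import torch.nn as nn


class VariationalDropoutGRU(nn.Module):
    """GRU with 'locked' variational dropout: one multiplicative mask per
    sequence, reused at every time step. Illustrative sketch only, not BARNN."""

    def __init__(self, input_dim, hidden_dim, output_dim, p=0.25):
        super().__init__()
        self.cell = nn.GRUCell(input_dim, hidden_dim)
        self.head = nn.Linear(hidden_dim, output_dim)
        self.hidden_dim = hidden_dim
        self.p = p

    def forward(self, x):  # x: (batch, time, input_dim)
        batch, steps, _ = x.shape
        h = x.new_zeros(batch, self.hidden_dim)
        # Sample one dropout mask per sequence (resampled on every forward
        # pass), with inverted-dropout scaling by 1 / (1 - p).
        mask = torch.bernoulli(
            x.new_full((batch, self.hidden_dim), 1.0 - self.p)
        ) / (1.0 - self.p)
        outputs = []
        for t in range(steps):
            h = self.cell(x[:, t], h * mask)  # same mask at every step
            outputs.append(self.head(h))
        return torch.stack(outputs, dim=1)  # (batch, time, output_dim)


def predict_with_uncertainty(model, x, n_samples=32):
    """Monte Carlo estimate: the mask is resampled on each forward pass, so
    repeated passes give a predictive mean and per-step standard deviation."""
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(0), samples.std(0)


if __name__ == "__main__":
    model = VariationalDropoutGRU(input_dim=8, hidden_dim=64, output_dim=1)
    x = torch.randn(4, 20, 8)  # toy batch of 4 sequences, 20 steps each
    mean, std = predict_with_uncertainty(model, x)
    print(mean.shape, std.shape)  # torch.Size([4, 20, 1]) for both
```

The per-sequence mask can be read as a sample from an approximate posterior over the recurrent weights, which is the connection that lets dropout-based schemes act as cheap Bayesian inference in large recurrent models.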
Lay Summary: Sequence models in deep learning drive advances in areas like weather forecasting and molecular design, but they lack reliable ways to quantify uncertainty, which is crucial for scientific applications. We introduce BARNN, a Bayesian approach that equips these models with well-calibrated uncertainty estimates, enabling more trustworthy predictions in these domains.
Link To Code: https://github.com/dario-coscia/barnn
Primary Area: Probabilistic Methods->Bayesian Models and Methods
Keywords: Bayesian Modelling, Variational Inference, Autoregressive Neural Networks, Recurrent Neural Networks, Uncertainty Quantification, Scientific Machine Learning
Submission Number: 12620