Bayesian Learning via Neural Schrödinger-Föllmer Flows

Published: 29 Jan 2022, Last Modified: 22 Oct 2023 · AABI 2022 Poster
Keywords: Stochastic Control, Bayesian Deep Learning, SGLD, Variational Inference with Guarantees, Stochastic Flows
TL;DR: We apply ideas from the Schrödinger bridge literature and stochastic control to sample from posteriors in Bayesian ML.
Abstract: In this work we explore a new framework for approximate Bayesian inference in large datasets based on stochastic control. We advocate stochastic control as a finite-time alternative to popular steady-state methods such as stochastic gradient Langevin dynamics (SGLD). Furthermore, we discuss and adapt the existing theoretical guarantees of this framework and establish connections to existing VI routines in SDE-based models.
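To make the "finite-time alternative to SGLD" concrete: the Schrödinger-Föllmer approach simulates an SDE dX_t = u(X_t, t) dt + dW_t on [0, 1], started at X_0 = 0, whose time-1 marginal is the target distribution. Below is a minimal illustrative sketch (not the paper's neural parameterisation) where the drift u(x, t) = ∇ log E_Z[f(x + √(1−t) Z)], with f = dπ/dN(0, I), is estimated by plain Monte Carlo via Stein's identity; all function names and parameter choices here are our own assumptions for illustration.

```python
import numpy as np

def sf_sampler(log_target, d, n_particles=2000, n_steps=100, n_mc=64, seed=0):
    """Illustrative Schrödinger-Föllmer sampler (Monte Carlo drift, no neural net).

    Simulates dX_t = u(X_t, t) dt + dW_t on [0, 1] from X_0 = 0 by
    Euler-Maruyama; the time-1 particles approximate the target pi.
    """
    rng = np.random.default_rng(seed)
    dt = 1.0 / n_steps
    X = np.zeros((n_particles, d))

    def log_f(x):
        # log of dpi/dgamma with gamma = N(0, I): log pi(x) + ||x||^2 / 2
        # (additive constants cancel in the self-normalised drift estimate)
        return log_target(x) + 0.5 * np.sum(x**2, axis=-1)

    for k in range(n_steps):
        t = k * dt
        s = np.sqrt(max(1.0 - t, dt))  # remaining heat scale, floored near t = 1
        Z = rng.standard_normal((n_mc, n_particles, d))
        lw = log_f(X[None] + s * Z)                     # (n_mc, n_particles)
        w = np.exp(lw - lw.max(axis=0, keepdims=True))  # stabilised weights
        # Stein's identity: grad E[f(x+sZ)] = E[f(x+sZ) Z] / s, so
        # u(x, t) = E[f(x+sZ) Z] / (s * E[f(x+sZ)])
        u = (w[..., None] * Z).sum(axis=0) / (s * w.sum(axis=0)[:, None])
        X = X + u * dt + np.sqrt(dt) * rng.standard_normal((n_particles, d))
    return X
```

For a Gaussian target N(μ, I) the exact drift is the constant μ, so the sampler reaches the target in a single unit of time; this is the finite-time contrast with SGLD, whose Langevin dynamics only converge to the posterior as its stationary distribution. The paper's contribution replaces the costly Monte Carlo drift estimate with a learned (neural) drift.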
Reviewer: Francisco Vargas
Community Implementations: 1 code implementation (https://www.catalyzex.com/paper/arxiv:2111.10510/code)