Neural Variational Gradient Descent

Published: 29 Jan 2022, Last Modified: 22 Oct 2023
AABI 2022 Poster
Keywords: Approximate inference, Bayesian deep learning, Stein variational gradient descent
Abstract: Particle-based approximate Bayesian inference approaches such as Stein Variational Gradient Descent (SVGD) combine the flexibility and convergence guarantees of sampling methods with the computational benefits of variational inference. In practice, SVGD relies on the choice of an appropriate kernel function, which affects its ability to model the target distribution; choosing a good kernel is a challenging problem with only heuristic solutions. We propose Neural Variational Gradient Descent (NVGD), which parametrizes the witness function of the Stein discrepancy by a deep neural network whose parameters are learned in parallel with the inference, removing the need for any kernel choice. We empirically validate our method on synthetic and real-world inference problems.
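To make the idea concrete, here is a minimal sketch in JAX of the scheme the abstract describes, not the authors' implementation: a small network f_phi is fit to maximize a regularized Stein discrepancy over the current particles, and the particles are then transported along f_phi. All names (`witness`, `nvgd_step`), the L2 penalty in place of a hard norm constraint, and the step sizes are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch of neural-witness Stein variational inference.
import jax
import jax.numpy as jnp

def init_params(key, d, hidden=32):
    k1, k2 = jax.random.split(key)
    w1 = jax.random.normal(k1, (d, hidden)) / jnp.sqrt(d)
    w2 = jax.random.normal(k2, (hidden, d)) / jnp.sqrt(hidden)
    return [w1, jnp.zeros(hidden), w2, jnp.zeros(d)]

def witness(params, x):
    # MLP witness function f_phi: R^d -> R^d.
    w1, b1, w2, b2 = params
    return jnp.tanh(x @ w1 + b1) @ w2 + b2

def neg_stein_discrepancy(params, particles, score_fn, lam=1.0):
    # Stein discrepancy with an L2 penalty standing in for a norm constraint:
    #   S(q, p) = E_q[ f(x)^T grad log p(x) + div f(x) ] - lam * E_q[||f||^2]
    def term(x):
        f = witness(params, x)
        div = jnp.trace(jax.jacfwd(lambda y: witness(params, y))(x))
        return f @ score_fn(x) + div - lam * jnp.sum(f ** 2)
    return -jnp.mean(jax.vmap(term)(particles))

def nvgd_step(params, particles, score_fn, inner_steps=20,
              inner_lr=1e-2, particle_lr=1e-1):
    # Inner loop: fit the witness network on the current particles ...
    grad_loss = jax.grad(neg_stein_discrepancy)
    for _ in range(inner_steps):
        g = grad_loss(params, particles, score_fn)
        params = [p - inner_lr * gi for p, gi in zip(params, g)]
    # ... then transport the particles along the learned witness direction.
    moves = jax.vmap(lambda x: witness(params, x))(particles)
    return params, particles + particle_lr * moves

# Usage: approximate a 2-D standard normal target.
score = jax.grad(lambda x: -0.5 * jnp.sum(x ** 2))  # grad log p(x)
key_p, key_x = jax.random.split(jax.random.PRNGKey(0))
params = init_params(key_p, d=2)
particles = jax.random.normal(key_x, (100, 2)) + 3.0
for _ in range(200):
    params, particles = nvgd_step(params, particles, score)
```

The update mirrors SVGD's x <- x + eps * phi(x), except that the transport direction comes from the learned network rather than a closed-form kernelized expression.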
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2107.10731/code)