Improved Marginal Unbiased Score Expansion (MUSE) via Implicit Differentiation

Published: 29 Nov 2022, Last Modified: 14 Apr 2024
Venue: SBM 2022 Poster
Readers: Everyone
Keywords: Bayesian inference, Hamiltonian Monte Carlo, Bayesian Neural Networks, Implicit Differentiation, Automatic Differentiation
TL;DR: We use implicit differentiation to improve the Marginal Unbiased Score Expansion (MUSE) algorithm, a score-based method for approximate Bayesian inference, and demonstrate orders of magnitude speedups over Hamiltonian Monte Carlo on several problems
Abstract: We apply the technique of implicit differentiation to boost performance, reduce numerical error, and remove required user-tuning in the Marginal Unbiased Score Expansion (MUSE) algorithm for hierarchical Bayesian inference. We demonstrate these improvements on three representative inference problems: 1) an extended Neal's funnel, 2) Bayesian neural networks, and 3) probabilistic principal component analysis. On our particular test cases, MUSE with implicit differentiation is faster than Hamiltonian Monte Carlo by factors of 155, 397, and 5, respectively, or factors of 65, 278, and 1 without implicit differentiation, and yields good approximate marginal posteriors. The Julia and Python MUSE packages have been updated to use implicit differentiation, and can solve problems defined by hand or with any of a number of popular probabilistic programming languages and automatic differentiation backends.
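For readers unfamiliar with the core trick, the sketch below illustrates implicit differentiation in the abstract's sense on a toy root-finding problem: the derivative of a solver's output is obtained from the implicit function theorem rather than by differentiating through the solver's iterations. This is a minimal, self-contained JAX illustration, not the MUSE package API; the function `f`, the Newton solver, and the toy optimality condition are all assumptions made for this example.

```python
# Hypothetical illustration of implicit differentiation (not the MUSE API):
# differentiate the solution x*(theta) of f(x, theta) = 0 via the implicit
# function theorem, dx*/dtheta = -f_theta / f_x, instead of unrolling the
# iterative solver inside automatic differentiation.
import jax
import jax.numpy as jnp

def f(x, theta):
    # Toy optimality condition; in MUSE-like settings this would be the
    # gradient of a log-posterior with respect to the latent variable x.
    return jnp.tanh(x) + x - theta

def solve(theta, iters=50):
    # Plain Newton iterations to find x*(theta) with f(x*, theta) = 0.
    x = jnp.zeros(())
    for _ in range(iters):
        x = x - f(x, theta) / jax.grad(f, argnums=0)(x, theta)
    return x

def dxdtheta_implicit(theta):
    # Implicit function theorem at the converged point: no differentiation
    # through the solver loop is required.
    x_star = jax.lax.stop_gradient(solve(theta))
    fx = jax.grad(f, argnums=0)(x_star, theta)
    ftheta = jax.grad(f, argnums=1)(x_star, theta)
    return -ftheta / fx

# Unrolled autodiff through all Newton steps, for comparison.
dxdtheta_unrolled = jax.grad(solve)

theta0 = jnp.asarray(1.3)
print(dxdtheta_implicit(theta0))   # one linear solve at the converged point
print(dxdtheta_unrolled(theta0))   # cost and memory grow with iteration count
```

The implicit route costs a single (here scalar) linear solve at the converged point, independent of how many solver iterations were run and of how accurately intermediate iterates were differentiated, which is the kind of performance and numerical-error benefit the abstract refers to.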
Student Paper: No
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2209.10512/code)