Stein Variational Gradient Descent for Approximate Bayesian Computation

16 Oct 2019 (modified: 05 May 2023), AABI 2019
Keywords: Approximate Bayesian Computation, variational inference, Stein variational gradient descent, energy distance
Abstract: Approximate Bayesian Computation (ABC) provides a generic framework for Bayesian inference in likelihood-free models, but sampling-based posterior approximation is often time-consuming, and assessing its convergence is difficult. Stochastic variational inference casts posterior inference as an optimization problem and makes ABC scalable to large datasets. However, the complex simulation models involved in ABC often lead to complex posteriors that are hard to approximate with simple parametric variational distributions. We draw upon recent advances in implicit variational distributions and introduce the Stein variational gradient descent (SVGD) approach, which approximates the posterior with a set of nonparametric particles. We also find that the kernel in the SVGD algorithm helps reduce the large variance of the gradient estimators of the ABC likelihood. Moreover, we propose the energy distance as the statistic for evaluating the ABC likelihood, which reduces the difficulty of selecting appropriate summary statistics. Simulation studies demonstrate the correctness and efficiency of our algorithm.
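
The two building blocks named in the abstract are easy to sketch in isolation: the energy distance used as the ABC discrepancy, and the kernelized SVGD particle update. Below is a minimal NumPy sketch, not the authors' implementation; the function names energy_distance and svgd_step are illustrative, and in the actual ABC setting the score grad_logp would be replaced by an estimated gradient of the ABC log-likelihood plus the log-prior, which this toy code does not reproduce.

```python
import numpy as np

def energy_distance(x, y):
    """Energy distance between samples x (n, d) and y (m, d).

    Serves as the discrepancy between observed and simulated data,
    in place of hand-picked summary statistics.
    """
    def mean_pdist(a, b):
        d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
        return d.mean()
    return 2 * mean_pdist(x, y) - mean_pdist(x, x) - mean_pdist(y, y)

def svgd_step(particles, grad_logp, stepsize=1e-2):
    """One SVGD update. particles (n, d) are the current posterior
    particles; grad_logp (n, d) holds the (estimated) score at each
    particle. The kernel averaging over particles is what smooths
    noisy gradient estimates of the ABC likelihood.
    """
    n = particles.shape[0]
    diffs = particles[:, None, :] - particles[None, :, :]  # (n, n, d)
    sq_dists = (diffs ** 2).sum(-1)                        # (n, n)
    h = np.median(sq_dists) / np.log(n + 1)                # median-heuristic bandwidth
    k = np.exp(-sq_dists / h)                              # RBF kernel matrix
    # Repulsive term: sum_j grad_{x_j} k(x_j, x_i), keeps particles spread out
    grad_k = -(2.0 / h) * (k[:, :, None] * diffs).sum(axis=0)
    # Stein variational direction: kernel-weighted scores plus repulsion
    phi = (k @ grad_logp + grad_k) / n
    return particles + stepsize * phi
```

In a full ABC loop one would, at each iteration, simulate data from the model at every particle, score each simulation against the observations with energy_distance to form the ABC likelihood, estimate its gradient, and feed that into svgd_step.
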