Keywords: Bayesian Quadrature, Kernel Quadrature, Gaussian Process, Active Learning, Model Evidence, Approximate Bayesian Computation
Abstract: Calculation of Bayesian posteriors and model evidences typically requires numerical integration.
Bayesian quadrature (BQ), a surrogate-model-based approach to numerical integration, is capable of superb sample efficiency, but its lack of parallelisation has hindered its practical applications.
In this work, we propose a parallelised (batch) BQ method, employing techniques from kernel quadrature, that possesses an empirically exponential convergence rate.
Additionally, just as with Nested Sampling, our method permits simultaneous inference of both posteriors and model evidence.
Samples from our BQ surrogate model are re-selected via a kernel recombination algorithm to give a sparse set of samples, so that increasing the batch size incurs negligible additional time.
Empirically, we find that our approach significantly outperforms both state-of-the-art BQ techniques and Nested Sampling in sampling efficiency on various real-world datasets, including lithium-ion battery analytics.
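The core idea of Bayesian quadrature from the abstract can be sketched in a few lines: fit a Gaussian process surrogate to the integrand at a set of nodes, then integrate the posterior mean in closed form. The sketch below is a generic vanilla-BQ illustration (not the paper's batch method); the RBF lengthscale, node placement, and standard-normal target measure are all illustrative assumptions.

```python
import numpy as np

def bq_gaussian(f, nodes, ell=0.8, jitter=1e-8):
    """Vanilla Bayesian quadrature estimate of E_{x ~ N(0,1)}[f(x)].

    Fits a zero-mean GP with an RBF kernel to f at `nodes`, then
    integrates the GP posterior mean against N(0,1) in closed form.
    All hyperparameters here are illustrative, not from the paper.
    """
    X = np.asarray(nodes, dtype=float)
    # RBF Gram matrix: K_ij = exp(-(x_i - x_j)^2 / (2 ell^2)),
    # with a small jitter on the diagonal for numerical stability.
    D2 = (X[:, None] - X[None, :]) ** 2
    K = np.exp(-D2 / (2.0 * ell**2)) + jitter * np.eye(len(X))
    # Kernel mean embedding z_i = ∫ k(x_i, x) N(x; 0, 1) dx,
    # available in closed form for the RBF kernel / Gaussian measure pair.
    z = np.sqrt(ell**2 / (ell**2 + 1.0)) * np.exp(-X**2 / (2.0 * (ell**2 + 1.0)))
    # BQ estimate is a weighted sum of function values: z^T K^{-1} f(X).
    weights = np.linalg.solve(K, z)
    return weights @ f(X)

# Sanity check: E[x^2] under N(0,1) is exactly 1.
nodes = np.linspace(-4.0, 4.0, 20)
est = bq_gaussian(lambda x: x**2, nodes, ell=0.8)
```

The sample efficiency claimed in the abstract comes from this surrogate: once the GP fits the integrand well, the quadrature weights extract the integral from very few evaluations.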
TL;DR: The combination of Bayesian Quadrature and Kernel Quadrature can quickly solve Bayesian inference for intractable likelihoods via parallel computing.
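The kernel recombination step mentioned in the abstract can be illustrated with a generic recombination (Carathéodory-type) reduction: given many candidate samples, find a nonnegative reweighting supported on only a few of them that exactly preserves the empirical means of a small set of test functions. The sketch below is not the paper's algorithm; the monomial test functions, the LP formulation, and all variable names are illustrative assumptions. It relies on the fact that a basic (vertex) solution of a feasible LP with m+1 equality constraints has at most m+1 nonzero weights.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 50, 5                        # n candidate samples, m test functions
x = rng.uniform(0.0, 1.0, size=n)

# Test-function matrix: rows are phi_j(x) = x^j. (An illustrative stand-in
# for the kernel-derived test functions used in kernel recombination.)
Phi = np.vstack([x**j for j in range(1, m + 1)])

# Recombination as a feasibility LP: find w >= 0 with sum(w) = 1 that
# matches the empirical mean of every test function. A vertex solution
# (dual simplex) is supported on at most m+1 of the n samples.
A_eq = np.vstack([Phi, np.ones((1, n))])
b_eq = np.append(Phi.mean(axis=1), 1.0)
res = linprog(c=np.zeros(n), A_eq=A_eq, b_eq=b_eq,
              bounds=(0, None), method="highs-ds")
w = res.x
support = w > 1e-9                  # indices of the re-selected sparse set
```

The uniform weighting w = 1/n is always feasible, so the LP succeeds; the sparse support is what makes adding batch members cheap, as the abstract describes.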
Supplementary Material: pdf
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/fast-bayesian-inference-with-batch-bayesian/code)