Fast Bayesian Inference with Batch Bayesian Quadrature via Kernel Recombination

Published: 31 Oct 2022, 18:00; Last Modified: 10 Oct 2022, 16:39. NeurIPS 2022 Accept.
Keywords: Bayesian Quadrature, Kernel Quadrature, Gaussian Process, Active Learning, Model Evidence, Approximate Bayesian Computation
TL;DR: Combining Bayesian quadrature with kernel quadrature enables fast Bayesian inference for intractable likelihoods via parallel computing.
Abstract: Calculation of Bayesian posteriors and model evidence typically requires numerical integration. Bayesian quadrature (BQ), a surrogate-model-based approach to numerical integration, is capable of superb sample efficiency, but its lack of parallelisation has hindered its practical applications. In this work, we propose a parallelised (batch) BQ method, employing techniques from kernel quadrature, that possesses an empirically exponential convergence rate. Additionally, just as with Nested Sampling, our method permits simultaneous inference of both posteriors and model evidence. Samples from our BQ surrogate model are re-selected via a kernel recombination algorithm to give a sparse set of samples, requiring negligible additional time to increase the batch size. Empirically, we find that our approach significantly outperforms the sample efficiency of both state-of-the-art BQ techniques and Nested Sampling on various real-world datasets, including lithium-ion battery analytics.
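To make the underlying idea concrete, the following is a minimal sketch of *vanilla* Bayesian quadrature (not the paper's batch method): a Gaussian-process surrogate with an RBF kernel is fit to function evaluations, and the integral under a Gaussian prior is obtained in closed form from the kernel mean embedding. All names (`rbf`, `bq_estimate`), the test function, and the hyperparameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rbf(a, b, ell):
    # Squared-exponential kernel matrix between 1-D point sets a and b.
    return np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def bq_estimate(x, y, ell, prior_var, jitter=1e-8):
    """Vanilla Bayesian-quadrature estimate of E_{x ~ N(0, prior_var)}[f(x)].

    x, y      : 1-D arrays of sample locations and observed f-values
    ell       : RBF kernel lengthscale (illustrative choice, not tuned)
    prior_var : variance of the Gaussian prior pi(x) = N(0, prior_var)

    Uses the closed-form kernel mean z_i = ∫ k(x, x_i) pi(x) dx available
    for an RBF kernel under a Gaussian prior; the estimate is z^T K^{-1} y.
    """
    K = rbf(x, x, ell) + jitter * np.eye(len(x))  # jitter for conditioning
    z = np.sqrt(ell ** 2 / (ell ** 2 + prior_var)) * np.exp(
        -0.5 * x ** 2 / (ell ** 2 + prior_var))
    return z @ np.linalg.solve(K, y)

# Toy check: E[x^2 + 1] under N(0, 1) is exactly 2.
x = np.linspace(-3.0, 3.0, 15)
y = x ** 2 + 1.0
est = bq_estimate(x, y, ell=1.0, prior_var=1.0)
print(est)  # close to 2
```

The paper's contribution builds on this by selecting batches of evaluation points and then sparsifying them with kernel recombination, so that the closed-form weights above are computed only for a small re-selected subset; that algorithm is not reproduced here.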
Supplementary Material: pdf