On Average-Case Error Bounds for Kernel-Based Bayesian Quadrature

Published: 07 Jul 2023, Last Modified: 07 Jul 2023
Accepted by TMLR
Abstract: In this paper, we study error bounds for Bayesian quadrature (BQ), with an emphasis on noisy settings, randomized algorithms, and average-case performance measures. We seek to approximate the integral of functions in a Reproducing Kernel Hilbert Space (RKHS), focusing in particular on the Mat\'ern-$\nu$ and squared exponential (SE) kernels, where samples of the function may be corrupted by Gaussian noise. We provide a two-step meta-algorithm that serves as a general tool for relating the average-case quadrature error to the $L^2$-function approximation error. When specialized to the Mat\'ern kernel, we recover an existing near-optimal error rate while avoiding the repeated point sampling required by the existing method. When specialized to other settings, we obtain new average-case results, including for the SE kernel with noise and the Mat\'ern kernel with misspecification. Finally, we present algorithm-independent lower bounds that are more general than, and/or admit distinct proofs from, existing ones.
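To make the setting concrete, the following is a minimal sketch of kernel-based Bayesian quadrature as described in the abstract: estimating $\int_0^1 f(x)\,dx$ from noisy samples of $f$ via the Gaussian-process posterior mean under an SE kernel. The one-dimensional uniform integration measure and the hyperparameter names (`ell`, `sigma`) are illustrative assumptions, not the paper's exact experimental setup.

```python
import numpy as np
from math import erf, pi, sqrt

def se_kernel(x, y, ell):
    """SE kernel k(x, y) = exp(-(x - y)^2 / (2 ell^2))."""
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * ell ** 2))

def kernel_mean(x, ell):
    """Closed-form kernel mean embedding z_i = int_0^1 k(t, x_i) dt
    for the SE kernel under the uniform measure on [0, 1]."""
    c = ell * sqrt(pi / 2)
    return np.array([c * (erf((1 - xi) / (sqrt(2) * ell))
                          + erf(xi / (sqrt(2) * ell))) for xi in x])

def bq_estimate(x, y, ell, sigma):
    """Posterior-mean BQ estimate: z^T (K + sigma^2 I)^{-1} y."""
    K = se_kernel(x, x, ell)
    z = kernel_mean(x, ell)
    return z @ np.linalg.solve(K + sigma ** 2 * np.eye(len(x)), y)

# Example: integrate f(x) = x^2 on [0, 1] (true value 1/3)
# from samples corrupted by Gaussian noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 30)
y = x ** 2 + 1e-3 * rng.standard_normal(30)  # noisy observations
est = bq_estimate(x, y, ell=0.2, sigma=1e-3)
```

The estimate is a weighted sum of the observations, with weights determined by the kernel mean embedding and the regularized kernel matrix; the paper's meta-algorithm bounds the average-case error of such estimates via the $L^2$ error of the underlying function approximation.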
Submission Length: Long submission (more than 12 pages of main content)
Code: https://github.com/caitree/Kernelized-Bayesian-Quadrature
Assigned Action Editor: ~Nishant_A_Mehta1
License: Creative Commons Attribution 4.0 International (CC BY 4.0)
Submission Number: 839