Accurate and Scalable Stochastic Gaussian Process Regression via Learnable Coreset-based Variational Inference

Published: 07 May 2025 · Last Modified: 13 Jun 2025 · UAI 2025 Poster · CC BY 4.0
Keywords: Gaussian process, variational inference, inducing point methods
Abstract: We introduce a novel stochastic variational inference method for Gaussian process ($\mathcal{GP}$) regression by deriving a posterior over a learnable coreset: i.e., over weighted pseudo-input/output pairs. Unlike prior free-form variational families for stochastic inference, our coreset-based variational $\mathcal{GP}$ (CVGP) is defined in terms of the $\mathcal{GP}$ prior and the (weighted) data likelihood. This formulation naturally incorporates the inductive biases of the prior and ensures that its kernel and likelihood dependencies are shared with the posterior. By marginalizing over the latent $\mathcal{GP}$ coreset variables, we derive a variational lower bound on the log-marginal likelihood and show that it is amenable to stochastic optimization. CVGP reduces the dimensionality of the variational parameter search space to linear $\mathcal{O}(M)$ complexity, while ensuring numerical stability at $\mathcal{O}(M^3)$ time and $\mathcal{O}(M^2)$ space complexity. Evaluations on real-world and simulated regression problems demonstrate that CVGP achieves superior inference and predictive performance compared to state-of-the-art stochastic sparse $\mathcal{GP}$ approximation methods.
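To make the scaling claims in the abstract concrete, the sketch below shows the generic inducing-point ($M \ll N$) approximation for $\mathcal{GP}$ regression that methods in this family build on. This is *not* the paper's CVGP (which places the variational posterior over weighted pseudo-input/output pairs); it is a standard Titsias-style posterior mean, and all names (`rbf`, `Z`, the toy data) are illustrative. The only $M \times M$ factorization is of `A`, which is where the $\mathcal{O}(M^3)$ time and $\mathcal{O}(M^2)$ space costs arise.

```python
import numpy as np

def rbf(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between two sets of inputs."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
N, M = 200, 10                       # N data points, M inducing points
X = rng.uniform(-3.0, 3.0, size=(N, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(N)   # toy regression data
Z = np.linspace(-3.0, 3.0, M)[:, None]               # inducing inputs (illustrative)

noise = 0.1 ** 2
Kmm = rbf(Z, Z) + 1e-8 * np.eye(M)   # M x M kernel on inducing inputs (+ jitter)
Kmn = rbf(Z, X)                      # M x N cross-kernel

# Optimal variational posterior mean (Titsias-style):
#   mean(X) = Kmn^T (Kmm + Kmn Kmn^T / noise)^{-1} Kmn y / noise
A = Kmm + Kmn @ Kmn.T / noise        # only M x M system -> O(M^3) time, O(M^2) memory
L = np.linalg.cholesky(A)
alpha = np.linalg.solve(L.T, np.linalg.solve(L, Kmn @ y / noise))
mean = Kmn.T @ alpha                 # approximate posterior mean at the N training inputs

rmse = np.sqrt(np.mean((mean - y) ** 2))
```

In this framework the variational parameters are the $M$ inducing inputs plus a free-form $M$-dimensional mean and $M \times M$ covariance; the abstract's point is that CVGP instead ties the posterior to the $\mathcal{GP}$ prior and a weighted likelihood, shrinking the search space to $\mathcal{O}(M)$ parameters.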
Latex Source Code: zip
Code Link: https://github.com/iurteagalab/cvgp_regression
Signed PMLR Licence Agreement: pdf
Submission Number: 554