Stochastic Gradient MCMC for Gaussian Process Inference on Massive Geostatistical Data

Published: 10 Oct 2024, Last Modified: 06 Dec 2024
Venue: NeurIPS BDU Workshop 2024 Poster
License: CC BY-ND 4.0
Keywords: Stochastic gradient descent, SGLD, SGRLD, MCMC, Gaussian Process, scalable methods
TL;DR: Bayesian inference for spatial GP models using stochastic gradient Riemannian Langevin dynamics and the Vecchia approximation.
Abstract: Gaussian processes (GPs) are the workhorses of spatial data analyses, but are difficult to scale to large spatial datasets. The Vecchia approximation induces sparsity in the dependence structure and is one of several methods proposed to scale GP inference. We develop a stochastic gradient Markov chain Monte Carlo framework for efficient computation in GPs for spatial data. At each step, the algorithm subsamples a minibatch of locations and subsequently updates process parameters through stochastic gradient Riemannian Langevin dynamics (SGRLD) on a Vecchia-approximated GP likelihood. We are able to conduct full Bayesian analysis for GPs with up to 100,000 locations using our spatial SGRLD, and demonstrate its efficacy through numerical studies and an application to ocean temperature data.
Submission Number: 19
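The abstract's update step (subsample a minibatch of locations, then take a stochastic-gradient Langevin step on a Vecchia-approximated GP log-likelihood) can be sketched as below. This is an illustration only, not the paper's implementation: it uses plain SGLD with an identity preconditioner rather than the Riemannian metric of SGRLD, finite-difference gradients rather than analytic ones, a flat prior, and a toy 1-D exponential covariance; all parameter names and values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def exp_cov(s, t, log_var, log_range):
    # Exponential covariance k(s, t) = sigma^2 * exp(-|s - t| / rho)
    return np.exp(log_var) * np.exp(-np.abs(s[:, None] - t[None, :]) / np.exp(log_range))

# Simulate a small 1-D spatial GP (true log-variance 0, true range 0.2)
n = 500
locs = np.sort(rng.uniform(0.0, 1.0, n))
K_true = exp_cov(locs, locs, 0.0, np.log(0.2)) + 1e-8 * np.eye(n)
y = np.linalg.cholesky(K_true) @ rng.standard_normal(n)

# Vecchia conditioning sets: each ordered location conditions on its m
# nearest predecessors, inducing sparsity in the dependence structure
m = 10
neighbors = [np.arange(max(0, i - m), i) for i in range(n)]

def cond_loglik(i, theta):
    # log p(y_i | y_{N(i)}) under the Vecchia factorization: a univariate
    # Gaussian with kriging mean/variance given the neighbor set N(i)
    log_var, log_range = theta
    N = neighbors[i]
    if len(N) == 0:
        mu, var = 0.0, np.exp(log_var)
    else:
        sN = locs[N]
        K_NN = exp_cov(sN, sN, log_var, log_range) + 1e-8 * np.eye(len(N))
        k_iN = exp_cov(locs[[i]], sN, log_var, log_range)[0]
        w = np.linalg.solve(K_NN, k_iN)
        mu = w @ y[N]
        var = np.exp(log_var) - w @ k_iN
    return -0.5 * (np.log(2 * np.pi * var) + (y[i] - mu) ** 2 / var)

def minibatch_grad(theta, batch, eps=1e-4):
    # Finite-difference gradient of the minibatch log-likelihood,
    # rescaled by n / |batch| to be unbiased for the full-data gradient
    g = np.zeros_like(theta)
    scale = n / len(batch)
    for d in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[d] += eps
        tm[d] -= eps
        fp = sum(cond_loglik(i, tp) for i in batch)
        fm = sum(cond_loglik(i, tm) for i in batch)
        g[d] = scale * (fp - fm) / (2 * eps)
    return g

# SGLD loop: half-step along the stochastic gradient plus injected
# Gaussian noise with matched scale sqrt(step)
theta = np.array([0.5, np.log(0.5)])  # initial (log-variance, log-range)
step = 1e-4
chain = []
for _ in range(200):
    batch = rng.choice(n, size=32, replace=False)
    grad = minibatch_grad(theta, batch)  # flat prior: log-likelihood gradient only
    theta = theta + 0.5 * step * grad + np.sqrt(step) * rng.standard_normal(2)
    chain.append(theta.copy())
chain = np.array(chain)
print(chain.shape, chain[-1])
```

Replacing the identity preconditioner with a position-dependent metric (and its correction term) would turn this into SGRLD; the paper additionally scales this scheme to up to 100,000 locations.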