gp2Scale: A Class of Compactly-Supported Non-Stationary Kernels and Distributed Computing for Exact Gaussian Processes on 10 Million Data Points

10 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: Gaussian Process, HPC, Kernels
TL;DR: The paper proposes a method to scale up exact Gaussian processes with the help of non-stationary, compactly supported kernels and HPC.
Abstract: Despite a large corpus of recent work on scaling up Gaussian processes, a stubborn trade-off between computational speed, prediction and uncertainty quantification accuracy, and customizability remains. This is because the vast majority of existing methodologies exploit various levels of approximations that lower accuracy and limit the flexibility of kernel and noise-model designs --- an unacceptable drawback at a time when expressive non-stationary kernels are on the rise in many fields. Here, we propose a methodology we term \emph{gp2Scale} that allows us to scale exact Gaussian processes to more than 10 million data points without relying on approximations, but instead by working with the existing capabilities of a GP: the kernel design. Highly flexible, compactly supported, and non-stationary kernels lead to the identification of naturally occurring sparse structure in the covariance matrix, which is then exploited for the calculations of the linear system solution and the log-determinant for training. We demonstrate our method's functionality on several real-life datasets and present comparisons to state-of-the-art approximation algorithms. Although we show superiority in approximation performance in many cases, the method's real power lies in the total agnosticism regarding arbitrary GP customizations --- core kernel design, noise, and mean functions --- or the type of input space, making the method optimally suited for modern Gaussian process applications.
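The mechanism described in the abstract — a compactly supported kernel producing exact zeros in the covariance matrix, whose sparsity is then exploited for the linear solve — can be illustrated with a minimal sketch. This is not the authors' gp2Scale implementation; it uses a standard Wendland C² compactly supported kernel and SciPy's sparse direct solver purely as an assumed, illustrative stand-in.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def wendland_c2(r, support=1.0):
    # Wendland C^2 kernel: positive definite in dimensions <= 3 and
    # identically zero beyond `support`, so distant point pairs
    # contribute exact zeros to the covariance matrix.
    d = np.clip(r / support, 0.0, 1.0)
    return (1.0 - d) ** 4 * (4.0 * d + 1.0)

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 100.0, size=2000))

# Pairwise distances; entries beyond the support radius are exactly zero,
# making the covariance matrix naturally sparse.
support = 2.0
r = np.abs(x[:, None] - x[None, :])
K = wendland_c2(r, support)
K_sparse = sparse.csr_matrix(K + 1e-4 * np.eye(len(x)))  # noise jitter

# Toy observations; the GP training solve K^{-1} y runs on the
# sparse matrix instead of a dense O(n^3) factorization.
y = np.sin(0.1 * x) + 0.05 * rng.standard_normal(len(x))
alpha = spsolve(K_sparse.tocsc(), y)

density = K_sparse.nnz / (len(x) ** 2)
print(f"covariance density: {density:.3%}")
```

With a support radius that is small relative to the domain, only a few percent of the covariance entries are nonzero, which is the sparsity the paper exploits for both the linear-system solution and the log-determinant during training. gp2Scale additionally makes the support and shape hyperparameters learnable and non-stationary, and distributes the sparse computations over HPC resources.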
Supplementary Material: zip
Primary Area: probabilistic methods (Bayesian methods, variational inference, sampling, UQ, etc.)
Submission Number: 3787