Laplace Approximated Gaussian Process State-Space Models

Published: 20 May 2022, Last Modified: 05 May 2023. UAI 2022 Oral.
Keywords: Probabilistic Machine Learning, Gaussian Processes, State-Space Models, Laplace Approximation, Variational Inference
Abstract: Gaussian process state-space models describe time series data in a probabilistic and non-parametric manner using a Gaussian process transition function. As inference is intractable, recent methods use variational inference and either rely on simplifying independence assumptions on the approximate posterior or learn the temporal states iteratively. The latter hampers optimization since the posterior over the present can only be learned once the posterior governing the past has converged. We present a novel inference scheme that applies stochastic variational inference for the Gaussian process posterior and the Laplace approximation on the temporal states. This approach respects the conditional dependencies in the model and, through the Laplace approximation, treats the temporal states jointly, thereby avoiding their sequential learning. Our method is computationally efficient and leads to better calibrated predictions compared to state-of-the-art alternatives on synthetic data and on a range of benchmark datasets.
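The core idea of the Laplace step above, treating all temporal states jointly rather than sequentially, can be sketched in a few lines. The following is a minimal illustration only, not the paper's method: the GP transition function is replaced by a fixed toy function `f`, and the noise variances, time horizon, and model structure are assumed values chosen for the sketch. It finds the MAP of the full state trajectory at once and then forms a joint Gaussian from the Hessian at that mode.

```python
# Hedged sketch of a Laplace approximation over the joint temporal states of a
# simple nonlinear state-space model. The paper combines this with stochastic
# variational inference for the GP transition posterior; here the transition f
# is a fixed toy function so the example stays self-contained.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
T = 20            # number of time steps (assumed for the sketch)
q, r = 0.1, 0.2   # process / observation noise variances (assumed)

def f(x):
    # Toy nonlinear transition standing in for the GP transition mean.
    return 0.8 * x + 0.2 * np.sin(x)

# Simulate a trajectory and noisy observations y_t = x_t + noise.
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = f(x_true[t - 1]) + np.sqrt(q) * rng.standard_normal()
y = x_true + np.sqrt(r) * rng.standard_normal(T)

def neg_log_joint(x):
    # -log p(x, y) up to constants: Gaussian transitions and observations,
    # with a standard-normal prior on the initial state x_0.
    trans = np.sum((x[1:] - f(x[:-1])) ** 2) / (2 * q)
    obs = np.sum((y - x) ** 2) / (2 * r)
    return trans + obs + x[0] ** 2 / 2

# Step 1: MAP of the whole trajectory x_{0:T-1} jointly (not sequentially).
res = minimize(neg_log_joint, y.copy(), method="L-BFGS-B")
x_map = res.x

# Step 2: Hessian of the negative log joint at the mode, by finite differences.
def hessian(fun, x, eps=1e-5):
    n = x.size
    grad = lambda z: np.array(
        [(fun(z + eps * e) - fun(z - eps * e)) / (2 * eps) for e in np.eye(n)]
    )
    g0 = grad(x)
    H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = eps
        H[i] = (grad(x + e) - g0) / eps
    return 0.5 * (H + H.T)   # symmetrize numerical error

H = hessian(neg_log_joint, x_map)
cov = np.linalg.inv(H)       # Laplace: p(x | y) ~= N(x_map, H^{-1})
print("MAP vs truth RMSE:", np.sqrt(np.mean((x_map - x_true) ** 2)))
print("posterior std of first state:", np.sqrt(cov[0, 0]))
```

Because the smoothing posterior couples every state with its neighbors through the transition term, the Hessian is block-tridiagonal; the resulting Gaussian therefore captures temporal correlations that a fully factorized variational posterior would discard.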