Improved Inverse-Free Variational Bounds for Sparse Gaussian Processes

Published: 29 Jan 2022, Last Modified: 05 May 2023. AABI 2022 Poster.
Keywords: Gaussian processes, GPs, sparse, variational
TL;DR: We provide convenient training objectives for GPs that require no matrix inverses or other decompositions, but show that these objectives are currently hard to optimise.
Abstract: The need for matrix decompositions (inverses) is often cited as a major impediment to scaling Gaussian process (GP) models, even in efficient approximations. To address this, Van der Wilk et al. (2020) introduced a variational lower bound that can be computed without these costly operations. We improve this bound by 1) removing the need for iterative procedures, thereby simplifying it, and 2) making it more numerically stable. While these improvements do not result in a procedure that is faster in wall-clock time than existing variational bounds, they are likely to be necessary steps along the way.
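
For background (standard sparse variational GP material, not taken from the paper itself): the usual evidence lower bound for a sparse GP with inducing variables $\mathbf{u}$ and variational posterior $q(\mathbf{u}) = \mathcal{N}(\mathbf{m}, S)$ involves the inverse of the $M \times M$ inducing kernel matrix $K_{uu}$, which is the cost an inverse-free bound is designed to avoid. A sketch of where that inverse enters:

\[
\mathcal{L} = \sum_{i=1}^{N} \mathbb{E}_{q(f_i)}\big[\log p(y_i \mid f_i)\big] - \mathrm{KL}\big(q(\mathbf{u}) \,\|\, p(\mathbf{u})\big),
\]
\[
q(f_i) = \mathcal{N}\Big(\mathbf{k}_i^\top K_{uu}^{-1}\mathbf{m},\; k_{ii} - \mathbf{k}_i^\top K_{uu}^{-1}\big(K_{uu} - S\big)K_{uu}^{-1}\mathbf{k}_i\Big).
\]

Both the predictive moments and the KL term require solves against (or a decomposition of) $K_{uu}$, at $\mathcal{O}(M^3)$ cost per evaluation; the inverse-free formulation of Van der Wilk et al. (2020) replaces these with operations built from matrix multiplications only.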