Warm Start Marginal Likelihood Optimisation for Iterative Gaussian Processes

Published: 27 May 2024, Last Modified: 28 May 2024 · AABI 2024 · CC BY 4.0
Keywords: Gaussian process, marginal likelihood, optimisation, linear system
TL;DR: We reuse intermediate solutions as initialisations to accelerate marginal likelihood optimisation in iterative Gaussian processes.
Abstract: Gaussian processes are versatile probabilistic machine learning models whose effectiveness often depends on good hyperparameters, which are typically learned by maximising the marginal likelihood. In this work, we consider iterative methods, which use iterative linear system solvers to approximate marginal likelihood gradients up to a specified numerical precision, allowing a trade-off between compute time and accuracy of a solution. We introduce a three-level hierarchy of marginal likelihood optimisation for iterative Gaussian processes, and identify that the computational costs are dominated by solving sequential batches of large positive-definite systems of linear equations. We then propose to amortise computations by reusing solutions of linear system solvers as initialisations in the next step, providing a $\textit{warm start}$. Finally, we discuss the necessary conditions for warm starts, quantify their consequences, and demonstrate their effectiveness on regression tasks, where warm starts achieve the same results as the conventional procedure while providing up to a $16 \times$ average speed-up across datasets.
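The core idea of the abstract can be illustrated with a minimal sketch: during marginal likelihood optimisation, consecutive hyperparameter settings produce similar kernel matrices, so the solve from the previous step is a good initialisation for the next. The sketch below is an illustrative assumption, not the paper's implementation: it uses a plain NumPy conjugate gradient solver and a hypothetical RBF `kernel` helper, and compares iteration counts for a cold start (zero vector) against a warm start (the previous step's solution).

```python
import numpy as np

def conjugate_gradient(A, b, x0=None, tol=1e-8, max_iter=1000):
    """Solve the positive-definite system A x = b with CG.

    x0 is an optional initialisation; passing the solution from the
    previous hyperparameter step provides a warm start.
    """
    x = np.zeros_like(b) if x0 is None else x0.copy()
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs = r @ r
    n_iter = 0
    for n_iter in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x, n_iter

# Toy regression setup: an RBF kernel matrix at two adjacent
# hyperparameter settings, mimicking one optimiser step.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))

def kernel(lengthscale, noise=0.1):
    """Hypothetical RBF kernel helper with noise jitter (illustrative)."""
    d2 = ((X[:, None] - X[None]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * lengthscale**2)) + noise * np.eye(len(X))

b = rng.standard_normal(200)
K_prev, K_next = kernel(1.0), kernel(1.01)  # small hyperparameter update

x_prev, _ = conjugate_gradient(K_prev, b)              # previous step's solve
x_cold, it_cold = conjugate_gradient(K_next, b)        # cold start from zero
x_warm, it_warm = conjugate_gradient(K_next, b, x0=x_prev)  # warm start
```

Because the hyperparameter update is small, the warm-start residual $\|b - K_{\text{next}} x_{\text{prev}}\|$ is much smaller than the cold-start residual $\|b\|$, so CG typically needs fewer iterations to reach the same tolerance; the paper's reported speed-ups come from amortising this effect over the whole optimisation trajectory.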
Submission Number: 16