Abstract: Current methods for stochastic hyperparameter learning in Gaussian Processes (GPs) rely
on approximations, such as computing biased stochastic gradients or using inducing points in
stochastic variational inference. However, such methods are not guaranteed
to converge to a stationary point of the true marginal likelihood. In this work, we propose
algorithms for exact stochastic inference of GPs with kernels that induce a Reproducing
Kernel Hilbert Space (RKHS) of moderate finite dimension. Our approach can also be
extended to infinite-dimensional RKHSs at the cost of forgoing exactness. For both finite-
and infinite-dimensional RKHSs, our method achieves better experimental results than existing
methods when memory resources limit the feasible batch size and the possible number of
inducing points.
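
As a rough intuition for why a finite-dimensional RKHS enables exact mini-batch computation, consider a kernel with an explicit feature map phi: X -> R^m, so that k(x, x') = phi(x)^T phi(x'). By the Woodbury identity and the matrix determinant lemma, the GP log marginal likelihood then depends on the data only through Phi^T Phi, Phi^T y, y^T y, and n, each an exact sum over data points. The sketch below is our own illustration of this reduction, not the authors' algorithm; the polynomial feature map is a hypothetical placeholder. It accumulates the sufficient statistics over mini-batches and evaluates the exact log marginal likelihood:

```python
import numpy as np

def feature_map(x, m):
    # Hypothetical placeholder phi: polynomial features of a 1-D input.
    # Any explicit phi: X -> R^m with k(x, x') = phi(x)^T phi(x') works.
    return np.power(x[:, None], np.arange(m))

def exact_gp_log_marginal(batches, sigma2, m):
    """Exact GP log marginal likelihood for an m-dimensional feature-map
    kernel, computed by streaming sufficient statistics over mini-batches:
        A = Phi^T Phi,  b = Phi^T y,  c = y^T y,  n.
    Woodbury and the matrix determinant lemma give
        log p(y) = -0.5 * [ (c - b^T (sigma2 I + A)^{-1} b) / sigma2
                            + (n - m) * log(sigma2)
                            + log det(sigma2 I + A)
                            + n * log(2 pi) ].
    """
    A = np.zeros((m, m))
    b = np.zeros(m)
    c, n = 0.0, 0
    for x_batch, y_batch in batches:      # one pass; the sums are exact
        P = feature_map(x_batch, m)       # shape (batch_size, m)
        A += P.T @ P
        b += P.T @ y_batch
        c += y_batch @ y_batch
        n += len(y_batch)
    S = sigma2 * np.eye(m) + A
    quad = (c - b @ np.linalg.solve(S, b)) / sigma2
    logdet = (n - m) * np.log(sigma2) + np.linalg.slogdet(S)[1]
    return -0.5 * (quad + logdet + n * np.log(2 * np.pi))
```

For example, calling exact_gp_log_marginal on any partition of (X, y) into mini-batches returns the same value as the full-data computation, since only the m-by-m statistics are held in memory; the batch size affects compute, not correctness. Differentiating through this one-pass evaluation yields exact hyperparameter gradients, the property that biased stochastic-gradient and inducing-point approximations forgo.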
Submission Length: Regular submission (no more than 12 pages of main content)
Assigned Action Editor: ~Geoff_Pleiss1
Submission Number: 5570