Conditioning Sparse Variational Gaussian Processes for Online Decision-making

May 21, 2021 (edited Jan 22, 2022) · NeurIPS 2021 Poster
  • Keywords: Gaussian processes, Bayesian optimization, stochastic variational Gaussian processes
  • TL;DR: We develop a way for stochastic variational Gaussian processes to fantasize with respect to new data, enabling them to be used in lookahead acquisitions for Bayesian optimization.
  • Abstract: With a principled representation of uncertainty and closed-form posterior updates, Gaussian processes (GPs) are a natural choice for online decision making. However, Gaussian processes typically require at least $\mathcal{O}(n^2)$ computations for $n$ training points, limiting their general applicability. Stochastic variational Gaussian processes (SVGPs) can provide scalable inference for a dataset of fixed size, but are difficult to efficiently condition on new data. We propose online variational conditioning (OVC), a procedure for efficiently conditioning SVGPs in an online setting without re-training through the evidence lower bound when new data arrive. OVC enables the pairing of SVGPs with advanced look-ahead acquisition functions for black-box optimization, even with non-Gaussian likelihoods. We show OVC provides compelling performance in a range of applications, including active learning of malaria incidence and reinforcement learning on MuJoCo simulated robotic control tasks.
  • Supplementary Material: pdf
  • Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
  • Code: https://github.com/wjmaddox/online_vargp
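As an illustration of the closed-form conditioning the abstract refers to (this is ordinary exact-GP inference, not the paper's OVC procedure), a minimal NumPy sketch of posterior updates with an RBF kernel: the $\mathcal{O}(n^2)$ kernel matrix and $\mathcal{O}(n^3)$ solve below are exactly the costs that motivate SVGPs. All names and values here are illustrative assumptions, not from the paper's codebase.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2):
    # Squared-exponential kernel matrix between row vectors of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    # Exact GP conditioning: building K costs O(n^2) memory,
    # and solving the n x n system costs O(n^3) time.
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s.T @ alpha
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, cov

# Tiny example: condition on 5 observations, predict at 2 new inputs.
X = np.linspace(0.0, 1.0, 5).reshape(-1, 1)
y = np.sin(2 * np.pi * X).ravel()
mean, cov = gp_posterior(X, y, np.array([[0.25], [0.75]]))
```

Because every new observation forces this dense solve to be redone from scratch, exact conditioning is a poor fit for look-ahead acquisition functions that must "fantasize" many candidate observations; OVC instead updates the SVGP's variational posterior directly.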