Localizing and Amortizing: Efficient Inference for Gaussian Processes

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission
TL;DR: A scalable variational inference method for Gaussian processes that leverages nearest neighbors and amortization.
Abstract: Gaussian process (GP) inference concerns the distribution of the underlying function given observed data points. GP inference based on local neighborhoods of data points can capture fine-scale correlations and allows a fine-grained decomposition of the computation. Following this direction, we propose a new inference model that, at each data point, conditions on the correlations and observations of its K nearest neighbors. Compared with previous work, we also eliminate the data-ordering prerequisite, which simplifies the inference procedure. Additionally, the inference task is decomposed into small subtasks through several technical innovations, making our model well suited to stochastic optimization. Since the decomposed subtasks share the same structure, we further speed up inference with amortized inference. Our model runs efficiently and achieves strong performance on several benchmark tasks.
Code: https://www.dropbox.com/sh/edl3r5hyndu9too/AABu_mU8EvRMzEQlYXM2u3C9a?dl=0
Keywords: Gaussian Processes, Variational Inference, Amortized Inference, Nearest Neighbors
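
The sketch below illustrates the core idea named in the abstract: performing GP inference at a query point using only its K nearest neighbors. It is a minimal illustration under an RBF kernel, not the authors' variational or amortized scheme; the function names (rbf_kernel, local_gp_predict), kernel, and hyperparameters are assumptions for the example.

import numpy as np

def rbf_kernel(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel: variance * exp(-||a - b||^2 / (2 * lengthscale^2)).
    d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def local_gp_predict(x_star, X, y, k=16, noise=1e-2):
    # Posterior mean and variance at x_star, conditioning only on its K nearest
    # training points rather than the full dataset.
    idx = np.argsort(np.sum((X - x_star) ** 2, axis=1))[:k]
    Xk, yk = X[idx], y[idx]
    # Standard GP conditional restricted to the neighbor set.
    Knn = rbf_kernel(Xk, Xk) + noise * np.eye(k)
    ks = rbf_kernel(Xk, x_star[None, :])          # shape (k, 1)
    L = np.linalg.cholesky(Knn)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, yk))
    v = np.linalg.solve(L, ks)
    mean = ks.T @ alpha
    var = rbf_kernel(x_star[None, :], x_star[None, :]) - v.T @ v
    return mean.item(), var.item()

# Usage: noisy 1-D regression example.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
m, s2 = local_gp_predict(np.array([0.5]), X, y, k=16)
print(m, s2)

Each such local subtask has the same structure (select neighbors, solve a small K-by-K system), which is what makes stochastic optimization and, in the paper's approach, amortization of the per-point inference possible.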