Efficient Sampling for Doubly Stochastic Variational Inference in Deep Gaussian Process Regression
Keywords: Gaussian processes, regression, random Fourier features, machine learning, variational inference, neural networks
TL;DR: We present an efficient deep Gaussian process that matches or outperforms the state-of-the-art doubly stochastic deep Gaussian process in accuracy while substantially reducing training overhead.
Abstract: Deep Gaussian Processes (DGPs) extend Gaussian Processes (GPs) to richer function approximation through multi-layer stacking. However, inference in DGPs is challenging because the posterior has no closed-form solution. Existing methods approximate the posterior through independent sampling combined with variational inference. These approaches ignore correlations between samples and incur substantial computational overhead as the number of layers grows, which limits performance. We present Efficient Deep Gaussian Processes (EDGPs), which enable efficient sampling between inner layers while preserving the full covariance structure. Unlike existing methods that trade accuracy for speed, EDGP achieves high efficiency without sacrificing precision. Experiments show that EDGP performs comparably to, or better than, state-of-the-art Doubly Stochastic Deep Gaussian Processes (DSDGPs) while training almost as efficiently as a basic neural network.
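The efficiency claim rests on drawing correlated (full-covariance) function values cheaply at each layer. Below is a minimal sketch, assuming an RBF kernel and using random Fourier features (one of the paper's keywords), of how sampling the feature weights once yields a jointly correlated GP draw without forming or factorizing an N x N kernel matrix. All names (rff_sample, num_features, etc.) are illustrative and not the authors' implementation.

```python
import numpy as np

def rff_sample(X, num_features=256, lengthscale=1.0, variance=1.0, rng=None):
    """Sample f ~ GP(0, k) at inputs X via random Fourier features.

    For the RBF kernel, k(x, y) ~= phi(x)^T phi(y) with
    phi(x) = sqrt(2 * variance / M) * cos(Omega^T x + b),
    Omega_ij ~ N(0, 1 / lengthscale^2), b_j ~ U[0, 2*pi].
    With w ~ N(0, I_M), f(x) = phi(x)^T w is an approximate GP sample
    whose values at all inputs are jointly correlated.
    """
    rng = np.random.default_rng() if rng is None else rng
    D = X.shape[1]
    omega = rng.standard_normal((D, num_features)) / lengthscale
    b = rng.uniform(0.0, 2.0 * np.pi, num_features)
    phi = np.sqrt(2.0 * variance / num_features) * np.cos(X @ omega + b)
    w = rng.standard_normal(num_features)  # drawn once -> full-covariance draw
    return phi @ w

rng = np.random.default_rng(0)
X = np.linspace(-3.0, 3.0, 50)[:, None]
f = rff_sample(X, rng=rng)  # one coherent sample of the whole function
```

The key design point this illustrates: once w is fixed, evaluating the sampled function at any input costs O(M), so propagating a correlated sample through stacked layers avoids the per-layer Cholesky factorization that independent full-covariance sampling would require.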
Supplementary Material: zip
Primary Area: General machine learning (supervised, unsupervised, online, active, etc.)
Submission Number: 9849