Keywords: Deep Probabilistic Models, Inhomogeneous Subjects, Regularization, Latent Representation, Model Expressiveness
TL;DR: We develop a novel deep probabilistic model based on the recently proposed Q-exponential process, which generalizes the deep Gaussian process and excels at modeling inhomogeneous subjects.
Abstract: Motivated by deep neural networks, the deep Gaussian process (DGP) generalizes the standard GP by stacking multiple layers of GPs. Despite the enhanced expressiveness, the GP, as an $L_2$ regularization prior, tends to over-smooth and is sub-optimal for inhomogeneous objects, such as images with edges. Recently, the Q-exponential process (Q-EP) has been proposed as an $L_q$ relaxation of the GP and has been shown to possess more desirable regularization properties through a parameter $q>0$, with $q=2$ corresponding to the GP. Sharing with the GP a similar tractability of posterior and predictive distributions, the Q-EP can also be stacked to improve its modeling flexibility.
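As a hedged illustration of the $L_q$ regularization property mentioned above, the Q-EP literature gives the marginal density roughly the following form (the exact normalizing constant is omitted here and should be checked against the paper):
$$
p(u \mid \mathbf{C}) \;\propto\; |\mathbf{C}|^{-1/2}\, r^{\left(\frac{q}{2}-1\right)\frac{d}{2}} \exp\left\{-\tfrac{1}{2}\, r^{q/2}\right\}, \qquad r = (u-\mu)^\top \mathbf{C}^{-1} (u-\mu),
$$
so the exponent penalizes $r^{q/2}$, an $L_q$-type regularization on the scaled norm of $u$; setting $q=2$ recovers the familiar Gaussian kernel $\exp\{-r/2\}$.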
In this paper, we generalize Q-EP to deep Q-EP to model inhomogeneous data with improved expressiveness. We introduce a shallow Q-EP as a latent variable model and then build a hierarchy of shallow Q-EP layers.
Sparse approximation by inducing points and a scalable variational strategy are applied to facilitate inference.
We demonstrate the numerical advantages of the proposed deep Q-EP model by comparing it with multiple state-of-the-art deep probabilistic models.
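To make the inference strategy concrete, below is a minimal sketch of one sparse variational layer of the kind such a deep model stacks, written against GPyTorch's deep-GP API. This is an illustration only, not the paper's implementation: the class name `QEPHiddenLayer` is hypothetical, and the layer returns a Gaussian marginal (the $q=2$ special case), whereas the paper's layers would output Q-EP distributions.

```python
import torch
from gpytorch.models.deep_gps import DeepGPLayer
from gpytorch.variational import CholeskyVariationalDistribution, VariationalStrategy
from gpytorch.means import LinearMean
from gpytorch.kernels import ScaleKernel, RBFKernel
from gpytorch.distributions import MultivariateNormal


class QEPHiddenLayer(DeepGPLayer):  # hypothetical name; mirrors a deep-GP layer
    """One sparse variational layer: inducing points + variational strategy.

    A true deep Q-EP layer would return a Q-EP marginal from `forward`;
    here we fall back to the Gaussian (q = 2) special case for illustration.
    """

    def __init__(self, input_dims, output_dims, num_inducing=64):
        # Inducing inputs: one set per output dimension (sparse approximation).
        inducing_points = torch.randn(output_dims, num_inducing, input_dims)
        variational_distribution = CholeskyVariationalDistribution(
            num_inducing_points=num_inducing,
            batch_shape=torch.Size([output_dims]),
        )
        # Scalable variational strategy with learnable inducing locations.
        variational_strategy = VariationalStrategy(
            self,
            inducing_points,
            variational_distribution,
            learn_inducing_locations=True,
        )
        super().__init__(variational_strategy, input_dims, output_dims)
        self.mean_module = LinearMean(input_dims)
        self.covar_module = ScaleKernel(RBFKernel(ard_num_dims=input_dims))

    def forward(self, x):
        # q = 2 case: a Gaussian layer; deep Q-EP generalizes this marginal.
        return MultivariateNormal(self.mean_module(x), self.covar_module(x))
```

A deep model then stacks several such layers, feeding each layer's samples into the next, and is trained by maximizing a variational lower bound (e.g., via `gpytorch.mlls.DeepApproximateMLL`).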
Submission Number: 2