Deep Bayesian Neural Nets as Deep Matrix Gaussian Processes

ICLR 2016 workshop submission (modified: 17 Feb 2016)
Abstract: We show that by employing a distribution over random matrices, the matrix variate Gaussian~\citep{gupta1999matrix}, for the neural network parameters, we can obtain a non-parametric interpretation of the hidden units after applying the ``local reparametrization trick''~\citep{kingma2015variational}. This provides a duality between Bayesian neural networks and deep Gaussian processes~\citep{damianou2012deep}, a property that was also shown by~\citet{gal2015dropout}. We show that we can borrow ideas from the Gaussian process literature to exploit the non-parametric properties of such a model. We empirically verify this model on a regression task.
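
A minimal numpy sketch (not the authors' code; the variable names and toy sizes below are hypothetical) of the identity the abstract relies on: if the weights follow a matrix variate Gaussian W ~ MN(M, U, V), then the pre-activations XW for a minibatch X follow MN(XM, X U X^T, V). Sampling the pre-activations directly, rather than the weights, is the local reparametrization trick in this setting, and the induced data covariance X U X^T is what gives the hidden units their Gaussian-process flavour.

```python
import numpy as np

def sample_matrix_gaussian(M, U, V, rng):
    """Draw one sample from MN(M, U, V) via Cholesky factors of U and V."""
    A = np.linalg.cholesky(U)          # U = A A^T (row covariance)
    B = np.linalg.cholesky(V)          # V = B B^T (column covariance)
    Z = rng.standard_normal(M.shape)   # i.i.d. standard normal noise
    return M + A @ Z @ B.T

def sample_preactivations(X, M, U, V, rng, jitter=1e-6):
    """Local reparametrization: sample XW without sampling W explicitly."""
    mean = X @ M                               # mean of the pre-activations
    row_cov = X @ U @ X.T                      # induced covariance over data points
    row_cov += jitter * np.eye(X.shape[0])     # numerical stabilizer for Cholesky
    return sample_matrix_gaussian(mean, row_cov, V, rng)

# Toy example (sizes are illustrative, not from the paper).
rng = np.random.default_rng(0)
n, d_in, d_out = 5, 3, 4
X = rng.standard_normal((n, d_in))
M = rng.standard_normal((d_in, d_out))         # variational mean of W
U = np.eye(d_in)                               # row covariance of W
V = np.eye(d_out)                              # column covariance of W

H = sample_preactivations(X, M, U, V, rng)     # one sample of the hidden pre-activations
print(H.shape)                                 # (5, 4)
```

Sampling in the pre-activation space draws one noise matrix per minibatch rather than one weight matrix per data point, and makes explicit that the hidden units are jointly Gaussian across data points with kernel X U X^T.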
Conflicts: uva.nl