Fully Bayesian Autoencoders with Latent Sparse Gaussian Processes

Published: 20 Jun 2023, Last Modified: 18 Jul 2023, AABI 2023 - Fast Track
Keywords: Gaussian Processes, Latent Variable Models, Bayesian Autoencoders, Bayesian Inference
Abstract: Autoencoders and their variants are among the most widely used models in representation learning and generative modeling. However, autoencoder-based models usually assume that the learned representations are i.i.d. and fail to capture correlations between data samples. To address this issue, we propose a novel Sparse Gaussian Process Bayesian Autoencoder (SGP-BAE) model, in which we impose fully Bayesian sparse Gaussian process priors on the latent space of a Bayesian autoencoder. We perform posterior estimation for this model via stochastic gradient Hamiltonian Monte Carlo (SGHMC). We evaluate our approach qualitatively and quantitatively on a wide range of representation learning and generative modeling tasks, and show that it consistently outperforms multiple alternatives based on variational autoencoders.
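The core idea, a GP prior over latent codes (so latents of different samples are correlated through auxiliary inputs) with SGHMC posterior sampling, can be illustrated in a toy numpy sketch. This is not the authors' implementation: for brevity it uses a linear decoder, a dense (non-sparse) GP prior instead of inducing points, and a simple momentum discretization of SGHMC (after Chen et al., 2014); all sizes, kernel hyperparameters, and the synthetic data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all sizes illustrative): N samples with auxiliary inputs x
# (e.g. timestamps) that index correlations between samples.
N, D, L = 20, 5, 2                         # samples, data dim, latent dim
x = np.linspace(0.0, 1.0, N)[:, None]
W_true = rng.normal(size=(L, D))
y = np.sin(2 * np.pi * x) @ np.ones((1, L)) @ W_true + 0.1 * rng.normal(size=(N, D))

def rbf(a, b, ls=0.2, var=1.0):
    """Squared-exponential kernel on the auxiliary inputs."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / ls**2)

# GP prior covariance over the latent codes; the diagonal term keeps the
# inverse well conditioned (a sparse-GP treatment would use inducing
# points instead of this dense N x N matrix).
K = rbf(x, x) + 0.1 * np.eye(N)
K_inv = np.linalg.inv(K)

noise = 0.5                                # observation noise std (assumed)

def neg_log_post(Z, W):
    """U(Z, W): GP prior on each latent dim + Gaussian decoder likelihood
    + standard-normal prior on the (linear) decoder weights W."""
    prior_Z = 0.5 * np.trace(Z.T @ K_inv @ Z)
    resid = y - Z @ W
    lik = 0.5 * (resid**2).sum() / noise**2
    return prior_Z + lik + 0.5 * (W**2).sum()

def grad_U(Z, W):
    resid = Z @ W - y
    gZ = K_inv @ Z + resid @ W.T / noise**2
    gW = Z.T @ resid / noise**2 + W
    return gZ, gW

# SGHMC-style sampling of latents Z and decoder weights W jointly:
# momentum update with friction alpha and matched injected noise.
Z = rng.normal(size=(N, L))
W = rng.normal(size=(L, D))
vZ = np.zeros_like(Z)
vW = np.zeros_like(W)
eta, alpha = 1e-4, 0.1                     # step size and friction (assumed)
U0 = neg_log_post(Z, W)
for _ in range(500):
    gZ, gW = grad_U(Z, W)
    vZ = (1 - alpha) * vZ - eta * gZ + np.sqrt(2 * alpha * eta) * rng.normal(size=vZ.shape)
    vW = (1 - alpha) * vW - eta * gW + np.sqrt(2 * alpha * eta) * rng.normal(size=vW.shape)
    Z, W = Z + vZ, W + vW

# The chain should move from the random init into a high-posterior region.
print(neg_log_post(Z, W) < U0)
```

In the full SGP-BAE, the linear decoder is a Bayesian neural network, the dense kernel matrix is replaced by a sparse GP with inducing points so the cost does not grow cubically in N, and SGHMC draws samples of all networks' weights rather than point estimates.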
Publication Venue: ICML 2023
Submission Number: 15