WiSE-ALE: Wide Sample Estimator for Aggregate Latent Embedding

Published: 03 May 2019, Last Modified: 05 May 2023, DeepGenStruct 2019
Keywords: Generative models, Latent variable models, Variational inference, Auto-encoder, Representation learning
TL;DR: We propose a new latent variable model for learning latent embeddings of high-dimensional data.
Abstract: In this paper, we present a new generative model for learning latent embeddings. In contrast to the classical generative process, where each observed data point is generated from an individual latent variable, our approach assumes a single global latent variable that generates the whole set of observed data points. We then propose a learning objective derived as an approximation to a lower bound on the data log likelihood, leading to our algorithm, WiSE-ALE. Whereas the standard ELBO objective encourages the variational posterior of each data point to match the prior, the WiSE-ALE objective matches the posterior averaged over all samples with the prior. This allows the sample-wise posterior distributions a wider range of acceptable embedding means and variances, leading to better reconstruction quality in the auto-encoding process. Through various examples and comparisons to other state-of-the-art VAE models, we demonstrate that WiSE-ALE has excellent information embedding properties, whilst still retaining the ability to learn a smooth, compact representation.
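The contrast in the abstract can be made concrete. For Gaussian posteriors q_i = N(mu_i, diag(sigma_i^2)) and a standard normal prior p(z), the usual ELBO penalises the average of the per-sample terms KL(q_i || p), whereas aggregate matching penalises KL(q_bar || p), where q_bar = (1/N) * sum_i q_i is the averaged posterior over the batch. The sketch below is only an illustration of that difference, not the paper's actual WiSE-ALE bound (which the abstract describes as an approximation to a lower bound on the log likelihood): the function names, the Monte Carlo estimator for the mixture KL, and the toy parameters are all our own assumptions.

```python
import numpy as np

def kl_per_sample(mu, log_var):
    """Closed-form KL( N(mu_i, diag(exp(log_var_i))) || N(0, I) ),
    averaged over the batch: the KL term in the standard ELBO."""
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=1)
    return kl.mean()

def log_gauss(z, mu, log_var):
    """Log density of a diagonal Gaussian (broadcasts over leading axes)."""
    d = z.shape[-1]
    return -0.5 * (d * np.log(2 * np.pi)
                   + np.sum(log_var + (z - mu)**2 / np.exp(log_var), axis=-1))

def kl_aggregate_mc(mu, log_var, n_mc=256, rng=None):
    """Monte Carlo estimate of KL( q_bar || N(0, I) ), where
    q_bar = (1/N) sum_i N(mu_i, diag(exp(log_var_i))) is the averaged
    (aggregate) posterior over the batch. A mixture-vs-Gaussian KL has
    no closed form, so we sample z ~ q_bar and average
    log q_bar(z) - log p(z). This estimator is our illustrative choice."""
    rng = rng or np.random.default_rng(0)
    n, d = mu.shape
    # Sample from the mixture: pick a component uniformly, then sample it.
    idx = rng.integers(n, size=n_mc)
    z = mu[idx] + np.exp(0.5 * log_var[idx]) * rng.standard_normal((n_mc, d))
    # log q_bar(z) via log-sum-exp over the N mixture components.
    comp = log_gauss(z[:, None, :], mu[None, :, :], log_var[None, :, :])
    log_qbar = np.logaddexp.reduce(comp, axis=1) - np.log(n)
    log_p = log_gauss(z, np.zeros(d), np.zeros(d))  # standard normal prior
    return np.mean(log_qbar - log_p)

# Toy batch: two posteriors pushed apart along the first latent dimension.
mu = np.array([[1.5, 0.0], [-1.5, 0.0]])
log_var = np.full((2, 2), np.log(0.25))
print(kl_per_sample(mu, log_var))    # per-sample KL: penalises each posterior
print(kl_aggregate_mc(mu, log_var))  # aggregate KL: never larger, by convexity
```

Because KL is convex in its first argument, KL(q_bar || p) is always at most the average of the per-sample KLs, which is one way to read the abstract's claim that aggregate matching gives the sample-wise posteriors more freedom in their means and variances.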