Isolating Sources of Disentanglement in Variational Autoencoders

12 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission
Abstract: We decompose the evidence lower bound (ELBO) to show the existence of a term measuring the total correlation between latent variables. This motivates our beta-TCVAE (Total Correlation Variational Autoencoder), a refinement of the state-of-the-art beta-VAE for learning disentangled representations without supervision. We further propose a principled, classifier-free measure of disentanglement called the Mutual Information Gap (MIG), and show a strong relationship between total correlation and disentanglement.
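The decomposition referenced in the abstract can be sketched as follows: the aggregate KL term of the ELBO splits into three parts, one of which is the total correlation that beta-TCVAE penalizes (notation here follows the standard form of this decomposition; symbols such as $q(z_j)$ for the marginal of latent dimension $j$ are assumptions of this sketch):

$$
\begin{aligned}
\mathbb{E}_{p(x)}\!\left[\mathrm{KL}\!\left(q(z \mid x)\,\|\,p(z)\right)\right]
&= I_q(x; z) && \text{(index-code mutual information)} \\
&\quad + \mathrm{KL}\!\Big(q(z)\,\Big\|\,\textstyle\prod_j q(z_j)\Big) && \text{(total correlation)} \\
&\quad + \textstyle\sum_j \mathrm{KL}\!\left(q(z_j)\,\|\,p(z_j)\right) && \text{(dimension-wise KL)}
\end{aligned}
$$

Penalizing the middle term alone, rather than the whole KL as in beta-VAE, is what distinguishes the total-correlation objective.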
TL;DR: We propose a new model and a new metric for learning and evaluating disentangled representations.
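As an illustration of the metric, here is a minimal sketch of an MIG-style computation on already-discretized latent codes: for each ground-truth factor, take the gap between the two latent dimensions with the highest mutual information, normalized by the factor's entropy. The histogram-based MI estimator and helper names below are assumptions of this sketch, not the paper's implementation.

```python
import numpy as np

def mutual_info(a, b):
    """MI (in nats) between two discrete 1-D arrays via their joint histogram."""
    av, a_idx = np.unique(a, return_inverse=True)
    bv, b_idx = np.unique(b, return_inverse=True)
    joint = np.zeros((len(av), len(bv)))
    np.add.at(joint, (a_idx, b_idx), 1)   # joint counts
    joint /= joint.sum()                  # joint probabilities
    pa = joint.sum(axis=1, keepdims=True) # marginal of a
    pb = joint.sum(axis=0, keepdims=True) # marginal of b
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (pa @ pb)[nz])).sum())

def entropy(a):
    """Entropy (in nats) of a discrete 1-D array."""
    _, counts = np.unique(a, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log(p)).sum())

def mig(codes, factors):
    """MIG over codes (n, d): discretized latents and factors (n, k):
    ground-truth factors; averages the normalized top-two MI gap per factor."""
    gaps = []
    for k in range(factors.shape[1]):
        mis = sorted((mutual_info(codes[:, j], factors[:, k])
                      for j in range(codes.shape[1])), reverse=True)
        gaps.append((mis[0] - mis[1]) / entropy(factors[:, k]))
    return float(np.mean(gaps))
```

For example, if one latent dimension copies a factor exactly while all other dimensions are uninformative, that factor contributes a gap of 1; if two dimensions duplicate the same information, the gap collapses to 0, which is the axis-alignment property the metric is designed to reward.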
Keywords: Representation Learning, Variational Learning, Variational Autoencoder, Disentangled Representations