Isolating Sources of Disentanglement in Variational Autoencoders

12 Feb 2018 (modified: 04 Jun 2018) · ICLR 2018 Workshop Submission · Readers: Everyone
  • Keywords: Representation Learning, Variational Learning, Variational Autoencoder, Disentangled Representations
  • TL;DR: We propose a novel model and metric to learn and evaluate the disentanglement quality of representations.
  • Abstract: We decompose the evidence lower bound (ELBO) to reveal a term measuring the total correlation between latent variables. This motivates our beta-TCVAE (Total Correlation Variational Autoencoder), a refinement of the state-of-the-art beta-VAE for learning disentangled representations without supervision. We further propose a principled, classifier-free measure of disentanglement called the Mutual Information Gap (MIG), and show a strong relationship between total correlation and disentanglement.
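The decomposition and metric named in the abstract can be sketched as follows. This is a reconstruction based on the published beta-TCVAE formulation, not text from this page; the symbols (data index n, latents z_j, ground-truth factors v_k) are assumed notation, since the abstract defines none:

```latex
% Averaged over the data index n, the KL term of the ELBO decomposes as
\mathbb{E}_{p(n)}\!\left[\mathrm{KL}\big(q(z \mid n)\,\|\,p(z)\big)\right]
  = \underbrace{I_q(z; n)}_{\text{index-code MI}}
  + \underbrace{\mathrm{KL}\Big(q(z)\,\Big\|\,\textstyle\prod_j q(z_j)\Big)}_{\text{total correlation}}
  + \underbrace{\sum_j \mathrm{KL}\big(q(z_j)\,\|\,p(z_j)\big)}_{\text{dimension-wise KL}}

% beta-TCVAE up-weights only the total-correlation term by beta.
% Mutual Information Gap (MIG), over K ground-truth factors v_k:
\mathrm{MIG} = \frac{1}{K}\sum_{k=1}^{K}\frac{1}{H(v_k)}
  \Big( I(z_{j^{(k)}}; v_k) - \max_{j \neq j^{(k)}} I(z_j; v_k) \Big),
\qquad j^{(k)} = \arg\max_j I(z_j; v_k)
```

Normalizing the gap by the factor entropy H(v_k) bounds MIG in [0, 1], so scores are comparable across factors with different numbers of values.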