Isolating Sources of Disentanglement in Variational Autoencoders

Tian Qi Chen, Xuechen Li, Roger Grosse, David Duvenaud

Feb 12, 2018 (modified: Jun 04, 2018) ICLR 2018 Workshop Submission
  • Abstract: We decompose the evidence lower bound (ELBO) to show the existence of a total correlation term among the latent variables (sketched below). This motivates our beta-TCVAE (Total Correlation Variational Autoencoder), a refinement of the state-of-the-art beta-VAE for learning disentangled representations without supervision. We further propose a principled, classifier-free measure of disentanglement called the Mutual Information Gap (MIG). We show a strong relationship between total correlation and disentanglement.
  • Keywords: Representation Learning, Variational Learning, Variational Autoencoder, Disentangled Representations
  • TL;DR: We propose a novel model and metric to learn and evaluate the disentanglement quality of representations.
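As a sketch of the decomposition the abstract refers to, using notation not defined on this page ($n$ indexes training examples, $q(z) = \mathbb{E}_{p(n)}[q(z \mid n)]$ is the aggregate posterior, and $z_j$ denotes an individual latent dimension), the dataset-averaged KL term of the ELBO splits into an index-code mutual information term, a total correlation term, and a dimension-wise KL term:

$$\mathbb{E}_{p(n)}\!\left[\mathrm{KL}\!\left(q(z \mid n)\,\|\,p(z)\right)\right] = I_q(z; n) + \mathrm{KL}\!\Big(q(z)\,\Big\|\,\textstyle\prod_j q(z_j)\Big) + \sum_j \mathrm{KL}\!\left(q(z_j)\,\|\,p(z_j)\right).$$

beta-TCVAE places its extra weight only on the middle (total correlation) term. MIG can likewise be sketched as the average, over ground-truth factors $v_k$, of the entropy-normalized gap between the two largest mutual informations with any single latent dimension:

$$\mathrm{MIG} = \frac{1}{K}\sum_{k=1}^{K} \frac{1}{H(v_k)} \Big( I(z_{j^{(k)}}; v_k) - \max_{j \neq j^{(k)}} I(z_j; v_k) \Big), \qquad j^{(k)} = \arg\max_j I(z_j; v_k).$$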