MIM: Mutual Information Machine

Sep 25, 2019 Blind Submission
  • Keywords: Mutual Information, Representation Learning, Generative Models, Probability Density Estimator
  • TL;DR: We propose a latent variable modelling framework, an alternative to variational auto-encoders, built on the principles of symmetry and high mutual information.
  • Abstract: We introduce the Mutual Information Machine (MIM), an autoencoder framework for learning joint distributions over observations and latent states. The model formulation reflects two key design principles: 1) symmetry, to encourage the encoder and decoder to learn different factorizations of the same underlying distribution; and 2) mutual information, to encourage the learning of representations useful for downstream tasks. The objective comprises the Jensen-Shannon divergence between the encoding and decoding joint distributions, plus a mutual information regularizer. We show that this objective can be bounded by a tractable cross-entropy loss between the true model and a parameterized approximation, and we relate it to maximum likelihood estimation and variational autoencoders (a sketch of the objective and loss follows below). Experiments show that MIM learns latent representations with high mutual information and good unsupervised clustering, while achieving negative log-likelihood (NLL) comparable to that of a VAE (given a sufficiently expressive architecture).
  • Code: https://www.dropbox.com/s/idnls2layat77sj/MIM-master.zip?dl=1
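For reference, the following is a hedged LaTeX reconstruction of the objective described in the abstract. The notation here is assumed, not taken from the paper's own definitions: $q_\theta(x,z)$ is the encoding joint, $p_\theta(x,z)$ the decoding joint, $H(\cdot,\cdot)$ cross-entropy, $\mathcal{M}_S$ their equal mixture, and $R_{\mathrm{MI}}$ the mutual information regularizer.

\[
\mathrm{JSD}\!\left(p_\theta(x,z)\,\|\,q_\theta(x,z)\right) + R_{\mathrm{MI}}
\;\le\;
\mathcal{L}_{\mathrm{MIM}}(\theta)
= \tfrac{1}{2}\Big[\, H\!\big(\mathcal{M}_S,\, q_\theta(x,z)\big) + H\!\big(\mathcal{M}_S,\, p_\theta(x,z)\big) \,\Big],
\qquad
\mathcal{M}_S = \tfrac{1}{2}\big(q_\theta(x,z) + p_\theta(x,z)\big).
\]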

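And a minimal runnable PyTorch sketch of such a symmetric cross-entropy loss. This is an illustrative toy under assumed parameterizations (Gaussian encoder and decoder, a standard normal prior P(z), and a hypothetical learned Gaussian data marginal q_psi(x)); it is not the authors' released implementation, for which see the Dropbox link above.

import torch
import torch.nn as nn
from torch.distributions import Normal

class GaussianCond(nn.Module):
    # Maps an input to a diagonal Gaussian over the output space.
    def __init__(self, in_dim, out_dim, hidden=64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(in_dim, hidden), nn.Tanh())
        self.mu = nn.Linear(hidden, out_dim)
        self.log_std = nn.Linear(hidden, out_dim)

    def forward(self, inp):
        h = self.body(inp)
        return Normal(self.mu(h), self.log_std(h).exp())

x_dim, z_dim = 8, 2
enc = GaussianCond(x_dim, z_dim)  # q_theta(z | x)
dec = GaussianCond(z_dim, x_dim)  # p_theta(x | z)
prior_z = Normal(torch.zeros(z_dim), torch.ones(z_dim))  # P(z)
# Hypothetical parameterized data marginal q_psi(x) (a single Gaussian),
# standing in for whatever marginal the paper actually uses.
marg_mu = nn.Parameter(torch.zeros(x_dim))
marg_log_std = nn.Parameter(torch.zeros(x_dim))

def mim_loss(x_batch):
    n = x_batch.shape[0]
    # Encoding-direction samples: x ~ data, z ~ q_theta(z|x).
    z_enc = enc(x_batch).rsample()
    # Decoding-direction samples: z ~ P(z), x ~ p_theta(x|z).
    z_dec = prior_z.sample((n,))
    x_dec = dec(z_dec).rsample()
    marg_x = Normal(marg_mu, marg_log_std.exp())

    def joint_log_probs(x, z):
        # log q_theta(x,z) = log q(z|x) + log q_psi(x)
        log_q = enc(x).log_prob(z).sum(-1) + marg_x.log_prob(x).sum(-1)
        # log p_theta(x,z) = log p(x|z) + log P(z)
        log_p = dec(z).log_prob(x).sum(-1) + prior_z.log_prob(z).sum(-1)
        return log_q, log_p

    lq_e, lp_e = joint_log_probs(x_batch, z_enc)
    lq_d, lp_d = joint_log_probs(x_dec, z_dec)
    # Average of the two cross-entropies under the mixture sample
    # distribution M_S (half encoding-, half decoding-direction samples):
    # -1/2 * 1/2 * [mean_enc(log q + log p) + mean_dec(log q + log p)].
    return -0.25 * (lq_e + lp_e + lq_d + lp_d).mean()

# Toy usage: one loss evaluation and backward pass on random data.
x = torch.randn(32, x_dim)
mim_loss(x).backward()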