GILBO: One Metric to Measure Them All

Alexander A. Alemi, Ian Fischer

Feb 12, 2018, ICLR 2018 Workshop Submission
  • Abstract: We propose a simple, tractable lower bound on the mutual information contained in the joint generative density of any latent variable generative model: the GILBO (Generative Information Lower BOund). It offers a data-independent measure of the complexity of the learned latent variable description, giving the log of the effective description length. It is well-defined for both VAEs and GANs. We compute the GILBO for 800 GANs and VAEs trained on MNIST and discuss the results. (A rough sketch of such an estimator is given after this listing.)
  • Keywords: GAN, VAE, mutual information, metric
  • TL;DR: We introduce an information-theoretic metric for evaluating GANs and VAEs.
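
A minimal sketch of how such a bound can be estimated in practice, assuming a frozen generator and an amortized variational encoder. This illustration uses the standard Barber–Agakov variational lower bound on mutual information; it is not the authors' implementation, and the toy linear generator, encoder architecture, and hyperparameters below are placeholder assumptions.

import math
import torch
import torch.nn as nn

latent_dim, data_dim = 8, 784

# Hypothetical frozen generator standing in for a trained GAN/VAE decoder p(x|z).
torch.manual_seed(0)
W = torch.randn(latent_dim, data_dim)

def generate(z):
    return torch.tanh(z @ W)

class Encoder(nn.Module):
    """Amortized variational encoder e(z|x): a diagonal Gaussian over z."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(data_dim, 256), nn.ReLU(),
                                 nn.Linear(256, 2 * latent_dim))

    def forward(self, x):
        mu, log_sigma = self.net(x).chunk(2, dim=-1)
        return torch.distributions.Normal(mu, log_sigma.exp())

encoder = Encoder()
opt = torch.optim.Adam(encoder.parameters(), lr=1e-3)
prior = torch.distributions.Normal(torch.zeros(latent_dim), torch.ones(latent_dim))

for step in range(2000):
    z = prior.sample((128,))        # z ~ p(z)
    x = generate(z)                 # x ~ p(x|z) from the frozen generator
    e = encoder(x)
    # Variational lower bound: E[log e(z|x) - log p(z)] <= I(X; Z).
    gilbo = (e.log_prob(z) - prior.log_prob(z)).sum(-1).mean()
    opt.zero_grad()
    (-gilbo).backward()             # maximize the bound over encoder parameters
    opt.step()

print(f"GILBO estimate: {gilbo.item():.2f} nats "
      f"(~{gilbo.item() / math.log(2):.2f} bits)")

The only trainable object is the auxiliary encoder; the generator is sampled from but never updated, so the resulting number characterizes the generator's learned latent description rather than any dataset.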