Entropic GANs meet VAEs: A Statistical Approach to Compute Sample Likelihoods in GANs

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission
Abstract: Building on the success of deep learning, two modern approaches to learning a probability model of observed data are Generative Adversarial Networks (GANs) and Variational AutoEncoders (VAEs). VAEs posit an explicit probability model for the data and compute a generative distribution by maximizing a variational lower bound on the log-likelihood function. GANs, in contrast, compute a generative model by minimizing a distance between the observed and generated probability distributions, without an explicit model for the observed data. The lack of an explicit probability model in GANs prohibits the computation of sample likelihoods within their framework and limits their use in statistical inference problems. In this work, we show that an optimal transport GAN with entropy regularization can be viewed as a generative model that maximizes a lower bound on average sample likelihoods, the same approach on which VAEs are based. In particular, our proof constructs an explicit probability model for GANs that can be used to compute likelihood statistics within the GAN framework. Our numerical results on several datasets demonstrate trends consistent with the proposed theory.
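
For reference, a minimal sketch of the two objectives the abstract connects, assuming the standard entropy-regularized optimal transport formulation and the standard VAE evidence lower bound; the paper's exact cost function and constants may differ:

```latex
% A minimal sketch, not the paper's exact statement. Entropy-regularized
% optimal transport between the observed distribution P_X and the generated
% distribution P_Y, with coupling set \Pi(P_X, P_Y), transport cost c(x, y)
% (e.g., quadratic), and coupling entropy H(\pi):
\[
  W_{\lambda}(P_X, P_Y)
    = \min_{\pi \in \Pi(P_X, P_Y)}
      \mathbb{E}_{(x, y) \sim \pi}\bigl[ c(x, y) \bigr] - \lambda H(\pi).
\]
% The abstract's claim is that minimizing this entropic GAN objective
% maximizes a lower bound on average sample likelihoods, analogous to how
% VAEs maximize the evidence lower bound on \log p(x):
\[
  \log p(x) \ge \mathbb{E}_{q(z \mid x)}\bigl[ \log p(x \mid z) \bigr]
    - \mathrm{KL}\bigl( q(z \mid x) \,\|\, p(z) \bigr).
\]
```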
Keywords: GAN, VAE, likelihood estimation, statistical inference
TL;DR: A statistical approach to compute sample likelihoods in Generative Adversarial Networks
Code: [yogeshbalaji/EntropicGANs_meet_VAEs](https://github.com/yogeshbalaji/EntropicGANs_meet_VAEs)
Data: [CIFAR-10](https://paperswithcode.com/dataset/cifar-10), [LSUN](https://paperswithcode.com/dataset/lsun), [SVHN](https://paperswithcode.com/dataset/svhn)