Entropy Estimates for Generative Models

12 Feb 2018 (modified: 05 May 2023) · ICLR 2018 Workshop Submission
Abstract: Different approaches to generative modeling entail different approaches to evaluation. While some models admit test likelihood estimation, for others only proxy metrics of visual quality are reported. In this paper, we propose a simple method to compute the differential entropy of an arbitrary decoder-based generative model. Using this approach, we find that models with qualitatively different samples are distinguishable in terms of entropy. In particular, adversarially trained generative models typically have higher entropy than variational autoencoders. Additionally, we provide evidence supporting entropy as a measure of sample diversity.
TL;DR: A simple differential entropy estimator applied to comparison of generative models
Keywords: generative modeling, differential entropy, variational autoencoder, variational inference
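The abstract does not spell out the estimator used, so as an illustration, the following is a minimal sketch of one standard approach to estimating differential entropy from samples alone: the Kozachenko-Leonenko k-nearest-neighbor estimator, applied to draws from a decoder `g(z)` with `z ~ p(z)`. This is an assumed stand-in for the paper's method, not a reproduction of it.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import digamma, gammaln

def knn_entropy(samples, k=3):
    """Kozachenko-Leonenko k-NN estimate of differential entropy (in nats).

    samples: (N, d) array of i.i.d. draws, e.g. decoder outputs g(z), z ~ p(z).
    Note: this is a generic sample-based estimator, not necessarily the
    method proposed in the paper.
    """
    x = np.asarray(samples, dtype=np.float64)
    n, d = x.shape
    tree = cKDTree(x)
    # Distance to the k-th nearest neighbor; k+1 because the query
    # returns each point itself as its own nearest neighbor.
    dist, _ = tree.query(x, k=k + 1)
    eps = dist[:, k]
    # Log volume of the unit d-ball: log(pi^{d/2} / Gamma(d/2 + 1)).
    log_c_d = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    return digamma(n) - digamma(k) + log_c_d + d * np.mean(np.log(eps))

# Sanity check on a distribution with known entropy: for N(0, I_d)
# the true differential entropy is (d/2) * log(2*pi*e).
rng = np.random.default_rng(0)
d = 2
est = knn_entropy(rng.standard_normal((5000, d)))
true_h = 0.5 * d * np.log(2 * np.pi * np.e)
```

Comparing entropy estimates from such a sample-based estimator across models (e.g. a GAN vs. a VAE trained on the same data) is one way to operationalize the kind of entropy-based comparison the abstract describes.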