Investigating Shifts in GAN Output-Distributions

09 Oct 2021, 14:49 (modified: 30 Nov 2021, 16:37) — NeurIPS 2021 Workshop DistShift Poster
Keywords: GAN, distribution shift
TL;DR: Investigation of shifts between the distributions of real training data and GAN-generated data
Abstract: A fundamental and still largely unsolved question in the context of Generative Adversarial Networks is whether they are truly able to capture the real data distribution and, consequently, to sample from it. In particular, the multidimensional nature of image distributions makes evaluating the diversity of GAN distributions complex. Existing approaches provide only a partial understanding of this issue, leaving the question unanswered. In this work, we introduce a loop-training scheme for the systematic investigation of observable shifts between the distributions of real training data and GAN-generated data. Additionally, we introduce several bounded measures for distribution shifts, which are both easy to compute and easy to interpret. Overall, the combination of these methods allows an explorative investigation of innate limitations of current GAN algorithms. Our experiments on different datasets and multiple state-of-the-art GAN architectures reveal large shifts between input and output distributions, indicating that existing theoretical guarantees on the convergence of output distributions do not appear to hold in practice.
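The abstract's specific measures are not detailed on this page, but one standard example of a bounded, easy-to-interpret shift measure is the total variation distance between discrete distributions (e.g., class frequencies of real training data versus GAN samples), which lies in [0, 1]. A minimal sketch, assuming class-count histograms as input (the function name and inputs are illustrative, not the paper's own):

```python
def total_variation_shift(real_counts, gen_counts):
    """Bounded shift measure between two discrete distributions.

    Returns the total variation distance: 0 means identical class
    proportions, 1 means disjoint support. Illustrative only -- the
    paper's actual measures are not specified on this page.
    """
    n_real = sum(real_counts)
    n_gen = sum(gen_counts)
    return 0.5 * sum(
        abs(r / n_real - g / n_gen)
        for r, g in zip(real_counts, gen_counts)
    )

# Example: balanced real classes vs. GAN output skewed toward class 0
real_counts = [5000, 5000]
gen_counts = [6500, 3500]
print(total_variation_shift(real_counts, gen_counts))  # 0.15
```

Because the value is bounded in [0, 1], it can be compared directly across datasets and architectures, which matches the abstract's emphasis on measures that are easy to compute and to interpret.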