Learning Gibbs-regularized GANs with variational discriminator reparameterization

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission · Readers: Everyone
Abstract: We propose a novel approach to regularizing generative adversarial networks (GANs) that leverages learned {\em structured Gibbs distributions}. Our method reparameterizes the discriminator as an explicit function of two densities: the generator PDF $q$ and a structured Gibbs distribution $\nu$. Building on recent work on invertible pushforward density estimators, this reparameterization is made possible by assuming the generator is invertible, which allows the generator PDF $q$ to be evaluated analytically. We further propose optimizing the Jeffrey divergence, which balances mode coverage with sample quality. Together, this loss and reparameterization let us effectively regularize the generator by imposing structure from domain knowledge on $\nu$, as in classical graphical models. Applying our method to a vehicle trajectory forecasting task, we obtain quantitatively superior mode coverage as well as higher-quality samples compared to traditional methods.
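The Jeffrey divergence referenced above is the standard symmetrized KL, $J(p, q) = \mathrm{KL}(p \,\|\, q) + \mathrm{KL}(q \,\|\, p)$: the forward term penalizes missed modes while the reverse term penalizes low-quality samples. As a minimal sketch of the density machinery the abstract relies on, the snippet below evaluates the generator PDF $q$ analytically for a toy invertible (affine) generator via the change-of-variables formula and combines it with an unnormalized Gibbs density $\nu$. The quadratic energy and the specific discriminator form $\mathrm{logit}\,D(x) = \log \nu(x) - \log q(x)$ are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

# Invertible (affine) generator x = f(z) = a*z + b, so the generator PDF q is
# available in closed form via the change-of-variables formula:
#   log q(x) = log p_z(f^{-1}(x)) - log |det df/dz|
a, b = 2.0, 0.5

def log_q(x):
    z = (x - b) / a                              # f^{-1}(x)
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi))   # standard-normal base density
    return log_pz - np.log(np.abs(a))            # Jacobian correction

# Hypothetical structured Gibbs distribution nu(x) ∝ exp(-E(x)); this quadratic
# energy is only a placeholder for the domain-specific structure the paper uses.
def log_nu_unnorm(x):
    return -0.5 * (x - 1.0)**2

def discriminator_logit(x):
    # Assumed reparameterization: the discriminator is an explicit function of
    # the two densities, e.g. logit D(x) = log nu(x) - log q(x).
    return log_nu_unnorm(x) - log_q(x)

x = np.linspace(-3.0, 3.0, 5)
print(discriminator_logit(x))
```

Because $\log q$ is exact rather than estimated, any structure imposed on the energy of $\nu$ (e.g., pairwise potentials from a graphical model over trajectory waypoints) flows directly into the discriminator's decision boundary.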
Keywords: deep generative models, graphical models, trajectory forecasting, GANs, density estimation, structured prediction
TL;DR: We reparameterize a GAN's discriminator into a form that admits regularization using a structured Gibbs distribution