TL;DR: A hierarchical generative model (a VAE-GAN hybrid) that learns a disentangled representation of data without compromising generative quality.
Abstract: Learning disentangled representations of data is one of the central themes in unsupervised learning in general and generative modelling in particular. In this work, we tackle a slightly more intricate scenario in which observations are generated from a distribution conditioned on a known control variate and a latent noise variate. To this end, we present a hierarchical model and a training method (CZ-GEM) that leverage recent developments in likelihood-based and likelihood-free generative models. We show that, by construction, CZ-GEM introduces the right inductive biases to ensure the disentanglement of the control variate from the noise variate, while also keeping the components of the control variate disentangled from one another. This is achieved without compromising the quality of the generated samples. Our approach is simple, general, and applicable in both supervised and unsupervised settings.
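The abstract does not spell out the architecture, but the TL;DR names a VAE-GAN hybrid over a control variate C and a noise variate Z. Below is a minimal PyTorch sketch of that two-stage idea: a likelihood-based (VAE-style) encoder that infers C, and a likelihood-free (GAN-style) generator that combines C with noise Z. All module names, layer sizes, and the exact wiring here are illustrative assumptions, not the CZ-GEM reference implementation; see the linked repository for the real code.

```python
import torch
import torch.nn as nn


class ControlEncoder(nn.Module):
    """VAE-style stage: infers the control variate C from an observation."""

    def __init__(self, x_dim=784, c_dim=8, hidden=256):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU())
        self.mu = nn.Linear(hidden, c_dim)
        self.logvar = nn.Linear(hidden, c_dim)

    def forward(self, x):
        h = self.body(x)
        mu, logvar = self.mu(h), self.logvar(h)
        # Reparameterisation trick: sample C while keeping gradients.
        c = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        return c, mu, logvar


class Generator(nn.Module):
    """GAN-style stage: maps (C, Z) to an observation. Z carries the
    residual detail that the low-dimensional C does not explain."""

    def __init__(self, c_dim=8, z_dim=64, x_dim=784, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(c_dim + z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, x_dim), nn.Sigmoid(),
        )

    def forward(self, c, z):
        return self.net(torch.cat([c, z], dim=-1))


class Discriminator(nn.Module):
    """Adversary for the likelihood-free stage: scores realism of x."""

    def __init__(self, x_dim=784, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.LeakyReLU(0.2),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x)


# Sampling: in the supervised setting C is given; in the unsupervised
# setting it is inferred by the encoder. Z is plain Gaussian noise, so
# varying C changes the controlled factors while Z only adds detail.
enc, gen = ControlEncoder(), Generator()
x = torch.rand(16, 784)   # stand-in batch of observations
c, mu, logvar = enc(x)
z = torch.randn(16, 64)
x_fake = gen(c, z)        # shape (16, 784)
```

Training would presumably combine a reconstruction/KL objective for the C stage with an adversarial objective for the Z stage; the specific losses and schedule are the paper's contribution and are not reproduced here.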
Code: https://github.com/AnonymousAuthors000/CZ-GEM
Keywords: disentangled representation learning, GAN, generative model, simulator