Generative Latent Flow

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission · Readers: Everyone
Keywords: Generative Model, Auto-encoder, Normalizing Flow
TL;DR: We propose a generative model that combines deterministic Auto-encoders with normalizing flows, and show that its sample quality greatly outperforms that of other AE-based generative models.
Abstract: In this work, we propose Generative Latent Flow (GLF), an algorithm for generative modeling of the data distribution. GLF uses an Auto-encoder (AE) to learn latent representations of the data, and a normalizing flow to map the distribution of the latent variables to that of simple i.i.d. noise. In contrast to some other Auto-encoder-based generative models, which use various regularizers to encourage the encoded latent distribution to match the prior, our model explicitly constructs a mapping between the two distributions, leading to better density matching while avoiding over-regularizing the latent variables. We compare our model with several related techniques and show that it has many relative advantages, including fast convergence, single-stage training, and a minimal reconstruction trade-off. We also study the relationship between our model and its stochastic counterpart, and show that it can be viewed as the vanishing-noise limit of VAEs with a flow prior. Quantitatively, under standardized evaluations, our method achieves state-of-the-art sample quality and diversity among AE-based models on commonly used datasets, and is competitive with GAN benchmarks.
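For readers skimming the abstract, the sketch below illustrates the kind of training step and sampling procedure it describes: a deterministic AE trained by reconstruction, plus a normalizing flow fit by maximum likelihood to map the latents to i.i.d. Gaussian noise. This is a minimal PyTorch sketch under stated assumptions, not the authors' implementation (see the Code link below); the `encoder`, `decoder`, and `flow` modules, the `beta` weight, the `flow.inverse` method, and the detached latents are all illustrative assumptions.

```python
import math
import torch
import torch.nn.functional as F


def glf_step(x, encoder, decoder, flow, optimizer, beta=1.0):
    """One GLF-style training step (illustrative sketch)."""
    optimizer.zero_grad()

    # Deterministic auto-encoder: reconstruct the input from its latent code.
    z = encoder(x)                        # shape (batch, latent_dim)
    recon_loss = F.mse_loss(decoder(z), x)

    # Normalizing flow maps the latents to i.i.d. standard Gaussian noise;
    # fit it by maximum likelihood via the change-of-variables formula:
    #   log p(z) = log N(f(z); 0, I) + log |det df/dz|
    # The detach is an assumption: here the flow loss does not update the AE.
    eps, log_det = flow(z.detach())       # flow returns (f(z), log |det Jacobian|)
    log_normal = -0.5 * (eps ** 2 + math.log(2 * math.pi)).sum(dim=1)
    nll = -(log_normal + log_det).mean()

    loss = recon_loss + beta * nll        # beta is a hypothetical weighting term
    loss.backward()
    optimizer.step()
    return loss.item()


@torch.no_grad()
def glf_sample(decoder, flow, n, latent_dim, device="cpu"):
    # Generation: draw i.i.d. Gaussian noise, invert the flow into latent
    # space, and decode. Assumes `flow` exposes an `inverse` method.
    eps = torch.randn(n, latent_dim, device=device)
    return decoder(flow.inverse(eps))
```

Note the contrast the abstract draws: instead of penalizing the encoder toward a prior (as in VAEs or regularized AEs), the flow's NLL term learns an explicit invertible map between the encoded latent distribution and the noise distribution, so the reconstruction objective is left untouched.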
Code: https://drive.google.com/file/d/1FsNM1CxeS9Ntg5jBGWTSPI_6CXRzsZAL/view?usp=sharing