PNF: Progressive normalizing flows

Anonymous

02 Jun 2021 (modified: 05 May 2023) · ICML 2021 Workshop INNF Blind Submission
Keywords: generative models, normalizing flows
TL;DR: Generative network that uses a new strategy to train flow-based models by progressively training at different image resolutions
Abstract: In this work, we introduce Progressive Normalizing Flows (PNF), a generative network that models high-dimensional input data distributions by progressively training several flow-based modules. Competing generative models, such as GANs or autoencoders, do not aim to learn the probability density of real data, while flow-based models achieve this objective at the prohibitive cost of a high-dimensional internal representation. Here, we address these limitations and introduce a new strategy for training flow-based models. We progressively train consecutive models at increasing data resolutions, which allows us to construct low-dimensional representations, as in autoencoders, while directly approximating the log-likelihood function. An additional feature of our model is its intrinsic ability to upscale data resolution in consecutive stages. We show that PNF offers performance superior or comparable to the state of the art.
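The abstract's central objective, directly approximating the log-likelihood, rests on the change-of-variables formula that all normalizing flows share. The sketch below illustrates that formula for a single invertible affine layer; the function names and the affine parameterization are illustrative assumptions, not the PNF implementation.

```python
import numpy as np

def standard_normal_logpdf(z):
    # Log density of a standard normal base distribution, summed over dimensions.
    return -0.5 * np.sum(z**2 + np.log(2 * np.pi), axis=-1)

def affine_flow_loglik(x, mu, log_sigma):
    # Invertible elementwise affine map: z = (x - mu) * exp(-log_sigma).
    # Change of variables: log p(x) = log p_z(z) + log|det dz/dx|,
    # where the Jacobian of this diagonal map has log-determinant -sum(log_sigma).
    z = (x - mu) * np.exp(-log_sigma)
    log_det = -np.sum(log_sigma)
    return standard_normal_logpdf(z) + log_det

# Toy usage: exact log-likelihood of a data point under the flow.
x = np.array([[1.0, -1.0]])
ll = affine_flow_loglik(x, mu=np.zeros(2), log_sigma=np.zeros(2))
```

Maximizing this quantity over training data is the flow-training objective; PNF's contribution, per the abstract, is applying it stage-wise at increasing resolutions rather than to a single high-dimensional representation.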