Learning Generative Models using Denoising Density Estimators

25 Sept 2019 (modified: 28 May 2023) · ICLR 2020 Conference Blind Submission
Code: https://drive.google.com/file/d/1EzKRxnFG1Hd8g6Ggvt-jvKkgpDDwK2bY
Community Implementations: 2 code implementations (https://www.catalyzex.com/paper/arxiv:2001.02728/code)
TL;DR: A novel approach to training generative models with density estimation, distinct from normalizing flows, continuous normalizing flows, VAEs, and autoregressive models.
Abstract: Learning generative probabilistic models that can estimate a continuous density from a set of samples, and that can sample from that density, is one of the fundamental challenges in unsupervised machine learning. In this paper, we introduce a new approach to obtain such models based on what we call denoising density estimators (DDEs). A DDE is a scalar function, parameterized by a neural network, that is efficiently trained to represent a kernel density estimator of the data. In addition, we show how to leverage DDEs to develop a novel approach to obtain generative models that sample from given densities. We prove that our algorithms to obtain both DDEs and generative models are guaranteed to converge to the correct solutions. Advantages of our approach include that it does not require specific network architectures as in normalizing flows, ODE solvers as in continuous normalizing flows, or adversarial training as in generative adversarial networks (GANs). Finally, we provide experimental results that demonstrate practical applications of our technique.
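For background on the abstract's claim that a DDE "represents a kernel density estimator of the data": a kernel density estimator smooths the empirical distribution of samples with a kernel (e.g., an isotropic Gaussian) to obtain a continuous density. The sketch below is a minimal, illustrative numpy implementation of that classical nonparametric estimator only; the paper's contribution is to replace it with a trained scalar neural network, whose training procedure is not reproduced here. The function names and the bandwidth value are this sketch's own choices, not from the paper.

```python
import numpy as np

def gaussian_kde(samples, bandwidth):
    """Return a density function estimated from `samples` by placing
    an isotropic Gaussian kernel of width `bandwidth` on each sample.
    Illustrative background only: a DDE trains a neural network to
    represent this kind of smoothed density instead of storing samples."""
    samples = np.asarray(samples, dtype=float)
    n, d = samples.shape
    # Normalization constant of a d-dimensional isotropic Gaussian kernel.
    norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)

    def density(x):
        x = np.asarray(x, dtype=float)
        sq_dists = np.sum((samples - x) ** 2, axis=1)
        # Average of kernel evaluations centered at every sample.
        return np.mean(np.exp(-sq_dists / (2.0 * bandwidth ** 2))) / norm

    return density

# Usage: estimate a 1-D standard normal density from samples.
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=(5000, 1))
p = gaussian_kde(data, bandwidth=0.3)
# p(0) should be close to the kernel-smoothed standard normal density at 0.
print(float(p(np.array([0.0]))))
```

Note that the kernel smoothing means the estimator targets the data density convolved with the kernel, not the data density itself; the same smoothed target appears whenever denoising-style objectives are used.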
Keywords: generative probabilistic models, denoising autoencoders, neural density estimation