Improving Training of Likelihood-based Generative Models with Gaussian Homotopy

Published: 19 Jun 2023, Last Modified: 28 Jul 2023, 1st SPIGM @ ICML Poster
Keywords: Generative Models, Normalizing Flows, Variational Autoencoders
Abstract: Generative Models (GMs) have recently gained popularity thanks to their success in various domains. In computer vision, for instance, they are able to generate astonishingly realistic-looking images. Likelihood-based GMs are fast at generating new samples, since they need only a single model evaluation per sample, but their sample quality is usually lower than that of score-based Diffusion Models (DMs). In this work, we verify that the success of score-based DMs is in part due to the process of data smoothing, by incorporating this process into the training of likelihood-based GMs. In the optimization literature, this process of data smoothing is referred to as Gaussian homotopy (GH), and it has strong theoretical grounding. Crucially, GH incurs no computational overhead, and it can be implemented by adding a single line of code to any training loop. We report results on various GMs, including Variational Autoencoders and Normalizing Flows, applied to image datasets, demonstrating that GH enables significant improvements in sample quality.
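The "single line of code" claim from the abstract can be illustrated with a minimal sketch of Gaussian-homotopy data smoothing inside a training loop. The linear annealing schedule, function names, and shapes below are assumptions for illustration, not details taken from the paper:

```python
import numpy as np

def sigma_schedule(step, total_steps, sigma_max=1.0):
    """Linearly anneal the smoothing level from sigma_max down to 0.

    (Illustrative schedule; the paper's actual schedule may differ.)
    """
    return sigma_max * (1.0 - step / total_steps)

rng = np.random.default_rng(0)
total_steps = 5
batch = rng.standard_normal((4, 8))  # stand-in for a batch of training data

for step in range(total_steps):
    sigma = sigma_schedule(step, total_steps)
    # The "one line" of GH: smooth the data with additive Gaussian noise
    # whose scale decreases as training progresses.
    smoothed = batch + sigma * rng.standard_normal(batch.shape)
    # ... compute the likelihood-based loss on `smoothed` and update the model ...
```

As the noise level anneals to zero, training transitions from a heavily smoothed (easier) objective back to the original data distribution, which is the core idea of a homotopy/continuation method.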
Submission Number: 40