Restoration based Generative Models

Published: 01 Feb 2023, Last Modified: 12 Mar 2024. Submitted to ICLR 2023.
Keywords: Diffusion Generative Models, Image Restoration, Maximum a Posteriori
TL;DR: A new framework for generative modeling from the perspective of restoration.
Abstract: Denoising generative models (DGMs) have recently attracted increasing attention by showing impressive synthesis quality. DGMs are built on a diffusion process that pushes data toward a noise distribution, and the models learn to denoise. In this paper, we establish an interpretation of DGMs in terms of image restoration (IR). Integrating the IR literature allows us to use an alternative objective and diverse forward processes, without being confined to the diffusion process. By imposing prior knowledge on the loss function, grounded in MAP estimation, we eliminate the need for the expensive sampling of DGMs. We also propose a multi-scale training scheme, which alleviates the latent inefficiency of DGMs by taking advantage of the flexibility of the forward process. Our model improves the quality and efficiency of both training and inference, achieving state-of-the-art performance when the number of forward steps is limited. Furthermore, we show the applicability of our model to inverse problems. We believe that our framework paves the way for designing a new type of flexible general generative model.
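
For readers unfamiliar with the term, the MAP estimation that the abstract grounds its loss on is the standard maximum-a-posteriori formulation of image restoration, which splits into a data-fidelity term and a prior (regularization) term. The sketch below states only this generic, textbook decomposition, not the submission's actual objective; the degradation operator H and regularizer R(x) are illustrative placeholders.

```latex
% Generic MAP formulation of image restoration (textbook background, not
% the submission's exact loss). y = observed degraded image, x = clean
% image to recover, H = assumed linear degradation operator,
% R(x) = -log p(x) = image prior / regularizer, sigma^2 = noise variance.
\[
\hat{x}_{\mathrm{MAP}}
  = \arg\max_{x} \; p(x \mid y)
  = \arg\min_{x} \; \bigl[ -\log p(y \mid x) - \log p(x) \bigr]
  = \arg\min_{x} \; \Bigl[ \tfrac{1}{2\sigma^{2}} \lVert y - Hx \rVert_{2}^{2} + R(x) \Bigr],
\]
% where the last equality assumes additive Gaussian observation noise.
```

Per the abstract, building the training loss around such a restoration-style objective, with the prior imposed directly on the loss, is what removes the need for the expensive sampling used by standard DGMs.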
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Generative models
Supplementary Material: zip
Community Implementations: [1 code implementation](https://www.catalyzex.com/paper/arxiv:2303.05456/code)