Observation-Guided Diffusion Probabilistic Models

Published: 01 Jan 2024, Last Modified: 06 Nov 2024 · CVPR 2024 · CC BY-SA 4.0
Abstract: We propose a novel diffusion-based image generation method called the observation-guided diffusion probabilistic model (OGDM), which effectively addresses the trade-off between quality control and fast sampling. Our approach reestablishes the training objective by integrating the guidance of the observation process with the Markov chain in a principled way. This is achieved by introducing an additional loss term derived from the observation based on a discriminator conditioned on the noise level, which employs a Bernoulli distribution indicating whether its input lies on the (noisy) real manifold or not. This strategy allows us to optimize the more accurate negative log-likelihood induced in the inference stage, especially when the number of function evaluations is limited. The proposed training scheme is advantageous even when incorporated only into the fine-tuning process, and it is compatible with various fast inference strategies, since our method yields better denoising networks using exactly the same inference procedure without incurring extra computational cost. We demonstrate the effectiveness of our training algorithm using diverse inference techniques on strong diffusion model baselines. Our implementation is available at https://github.com/Junoh-Kang/OGDM_edm.
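The abstract describes augmenting the standard denoising objective with an observation term: a binary cross-entropy loss from a noise-level-conditional discriminator whose Bernoulli output indicates whether a sample lies on the (noisy) real manifold. The sketch below illustrates that composite objective in NumPy; the function names, the weighting hyperparameter `lam`, and the scalar discriminator outputs are illustrative assumptions, not the authors' implementation (see the linked repository for that).

```python
import numpy as np

def bernoulli_observation_loss(disc_real, disc_fake, eps=1e-7):
    """Binary cross-entropy for a discriminator conditioned on the noise level.

    `disc_real` / `disc_fake` are Bernoulli probabilities the discriminator
    assigns to noisy real samples and to denoiser outputs at the same noise
    level. The term is small when real samples score near 1 and generated
    samples score near 0.
    """
    disc_real = np.clip(disc_real, eps, 1 - eps)
    disc_fake = np.clip(disc_fake, eps, 1 - eps)
    return -np.mean(np.log(disc_real)) - np.mean(np.log(1.0 - disc_fake))

def combined_training_loss(denoising_loss, disc_real, disc_fake, lam=0.1):
    """Standard diffusion (denoising) loss plus the weighted observation term.

    `lam` is an assumed trade-off weight; the original objective's exact
    weighting is defined in the paper, not here.
    """
    return denoising_loss + lam * bernoulli_observation_loss(disc_real, disc_fake)
```

With a well-trained discriminator (real samples scored near 1, generated ones near 0), the observation term vanishes and the objective reduces to the usual denoising loss; samples drifting off the noisy real manifold are penalized, which is consistent with the abstract's claim that the correction matters most when few function evaluations are used.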