Classifier-Driven Diffusion Model with Plug-and-Play Weakly-Supervised Learning for Conditional Generation
Keywords: Diffusion model, weakly supervised learning
TL;DR: We propose a novel diffusion model trained via multi-class classification, with plug-and-play weakly supervised learning for conditional generation
Abstract: Can a diffusion model for conditional generation be trained as a classifier? We address this question with the Classifier-Driven Diffusion Model (CLDDM), which trains a diffusion model by minimizing a per-timestep cross-entropy loss under class-label supervision while achieving high-quality class-conditional generation. In other words, CLDDM establishes a unified framework that demonstrates the equivalence between classification and generation. This equivalence enables new training strategies for conditional diffusion models. In particular, we show that "weakly supervised" generation can be realized by leveraging established classification objectives from weakly supervised learning. Experimental results on a toy dataset and image benchmarks demonstrate both quantitative and qualitative equivalence between CLDDM and standard diffusion models, and further confirm that CLDDM supports conditional generation under weak supervision, such as learning with noisy labels and learning from label proportions.
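The abstract's core idea, training a classifier on noisy inputs at every diffusion timestep with a cross-entropy objective, can be illustrated with a minimal sketch. Everything below (the toy 2-D data, the linear noise schedule, the linear-softmax classifier, and the function names `noisy_sample` and `per_timestep_ce`) is hypothetical scaffolding for illustration, not the paper's actual architecture or training recipe:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: 2-class, 2-D data; CLDDM's real experiments use
# a toy dataset and image benchmarks, per the abstract.
n, d, n_classes, T = 64, 2, 2, 100
shift = rng.integers(0, 2, size=(n, 1))
x0 = rng.normal(size=(n, d)) + np.array([3.0, 0.0]) * shift
y = shift.ravel()  # class label = which cluster the point came from

# A simple linear schedule for the cumulative signal fraction alpha_bar_t.
alpha_bar = np.linspace(0.999, 0.01, T)

def noisy_sample(x0, t):
    """Forward diffusion: x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps."""
    a = alpha_bar[t][:, None]
    return np.sqrt(a) * x0 + np.sqrt(1.0 - a) * rng.normal(size=x0.shape)

def per_timestep_ce(W, b, x0, y):
    """Cross-entropy of a softmax classifier on noisy inputs at random timesteps."""
    t = rng.integers(0, T, size=len(x0))
    xt = noisy_sample(x0, t)
    logits = xt @ W + b                              # (n, n_classes)
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    logp = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(y)), y].mean()

# At a uniform (all-zero) initialization, the loss is exactly log(n_classes).
W, b = np.zeros((d, n_classes)), np.zeros(n_classes)
loss = per_timestep_ce(W, b, x0, y)
print(round(loss, 4))  # → 0.6931, i.e. log(2)
```

Minimizing this objective over `(W, b)` with any gradient method would train the classifier on all noise levels at once; the paper's claim is that a diffusion model trained this way also serves as a class-conditional generator.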
Supplementary Material: zip
Primary Area: generative models
Submission Number: 581