Physics Informed Distillation for Diffusion Models

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Diffusion Models, Knowledge Distillation, Physics Informed Neural Networks
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We introduce a novel distillation approach for diffusion models, heavily inspired by Physics Informed Neural Networks (PINNs), that enables single-step image generation.
Abstract: Diffusion models have recently emerged as a potent tool in generative modeling, although their inherent iterative nature often results in slow image generation due to the need for multiple model evaluations. Recent progress has unveiled the intrinsic link between diffusion models and Probability Flow Ordinary Differential Equations (ODEs), thus enabling us to conceptualize diffusion models as ODE systems. Simultaneously, Physics Informed Neural Networks (PINNs) have substantiated their effectiveness in solving intricate differential equations through implicit modeling of their solutions. Building upon these foundational insights, we introduce Physics Informed Distillation (PID), a novel approach that employs a student model to represent the solution of the ODE system corresponding to the teacher diffusion model, akin to the principles employed in PINNs. Our approach demonstrates remarkable results, such as achieving an FID score of 3.92 on CIFAR-10 for single-step image generation. Additionally, we establish the stability of our method given a sufficiently high discretization number, paralleling observations in the PINN literature, thus highlighting its potential as a streamlined single-step distillation approach without the need for additional methodology-specific hyperparameters. The code will be made available upon acceptance.
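To make the abstract's idea concrete, here is a minimal sketch of a PINN-style distillation loss for a probability-flow ODE. This is not the authors' exact formulation: the callables `student` and `teacher_drift`, the finite-difference step `dt`, the terminal time `T`, and the boundary term are all illustrative assumptions consistent with the description above (a student modeling the ODE trajectory, trained against the teacher's ODE and a discretized time grid).

```python
# Hypothetical sketch of a PINN-style distillation loss (not the paper's code).
# Assumes: `teacher_drift(x, t)` returns the frozen teacher's probability-flow
# ODE velocity dx/dt, and `student(z, t)` models the trajectory x(t) that
# starts from the Gaussian latent z at t = T.
import torch

def pid_loss(student, teacher_drift, z, t, dt=1e-2, T=80.0):
    """PINN-style residual: the student's time derivative along the
    modeled trajectory should match the teacher's ODE drift there.

    z:  batch of Gaussian latents, shape (B, C, H, W)
    t:  batch of times sampled from a discretized grid in (0, T], shape (B,)
    dt: finite-difference step approximating d/dt of the student output
    """
    x_t = student(z, t)                      # point on the modeled trajectory
    with torch.no_grad():
        drift = teacher_drift(x_t, t)        # target velocity from the teacher

    # Finite-difference approximation of the student's time derivative.
    x_prev = student(z, t - dt)
    dxdt = (x_t - x_prev) / dt

    residual = (dxdt - drift).pow(2).mean()  # ODE residual, as in PINNs

    # Boundary condition: the trajectory starts at the latent, x(T) = z.
    # (Some formulations enforce this via the parameterization instead.)
    t_T = torch.full_like(t, T)
    boundary = (student(z, t_T) - z).pow(2).mean()
    return residual + boundary
```

Once such a student is trained, a single forward pass at a small time, e.g. `student(z, t_min)`, maps noise directly to an image, which is how modeling the whole ODE trajectory yields single-step generation.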
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7109