Physics Informed Distillation for Diffusion Models

Published: 09 Jun 2024, Last Modified: 09 Jun 2024. Accepted by TMLR.
Abstract: Diffusion models have recently emerged as a potent tool in generative modeling. However, their inherent iterative nature often results in slow image generation, since multiple model evaluations are required. Recent progress has revealed an intrinsic link between diffusion models and Probability Flow Ordinary Differential Equations (ODEs), enabling us to conceptualize diffusion models as ODE systems. Simultaneously, Physics Informed Neural Networks (PINNs) have demonstrated their effectiveness in solving intricate differential equations by implicitly modeling their solutions. Building upon these insights, we introduce Physics Informed Distillation (PID), which employs a student model to represent the solution of the ODE system corresponding to the teacher diffusion model, akin to the principles employed in PINNs. Through experiments on CIFAR-10 and ImageNet 64x64, we observe that PID achieves performance comparable to recent distillation methods. Notably, it exhibits predictable trends with respect to its method-specific hyperparameters and eliminates the need for synthetic dataset generation during distillation. Both properties contribute to its ease of use as a distillation approach for diffusion models.
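
To make the PINN-style objective concrete, below is a minimal, hypothetical sketch of such a distillation loss. Everything here is an illustrative assumption rather than the authors' implementation: the names `pid_loss`, `student`, and `teacher`, the EDM-style drift dx/dt = (x - D(x, t)) / t, the finite-difference time derivative, and the squared-error distance (the paper notes LPIPS as an empirical design choice; plain MSE is used here for simplicity).

```python
# Hypothetical sketch of a PINN-style distillation step (not the authors' code).
# Assumes an EDM-style teacher denoiser D(x, t) whose probability flow ODE is
# dx/dt = (x - D(x, t)) / t, and a student x_theta(z, t) that maps initial
# noise z and time t to a point on the ODE trajectory.
import torch

def pid_loss(student, teacher, z, t, dt=1e-3):
    # Student output approximating the trajectory point x(t) started from z.
    x = student(z, t)
    # Finite-difference estimate of the student's time derivative dx/dt.
    dx_dt = (student(z, t + dt) - x) / dt
    # Teacher's probability-flow drift evaluated at the student's point.
    with torch.no_grad():
        drift = (x - teacher(x, t)) / t.view(-1, 1, 1, 1)
    # PINN-style residual: make the student trajectory satisfy the teacher ODE.
    # (A boundary condition such as x_theta(z, T) = z would also need to be
    # enforced, e.g. through the student's parameterization.)
    return torch.mean((dx_dt - drift) ** 2)
```

After training, single-step generation would amount to evaluating the student once at the terminal time, e.g. `student(z, t_min)`, in place of the teacher's iterative sampler.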
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: Added author list; included link to codebase; merged theoretical contribution into the first item; added a remark on the reliance of CD on discretization parameters in Section 6.6; added clarification on LPIPS as an empirical design choice; added DSNO time comparisons; reorganized the ablation study sections.
Assigned Action Editor: ~Valentin_De_Bortoli1
Submission Number: 2090