Keywords: Diffusion Model, Anisotropic Diffusion, Adaptive Scheduler
TL;DR: learning matrix-valued noise schedules for anisotropic diffusion models, and efficient training/inference.
Abstract: We study anisotropic diffusion for generative modeling by replacing the scalar noise schedule with a matrix‑valued path $M_t$ that allocates noise (and denoising effort) across subspaces. We introduce a trajectory‑level objective that jointly trains the score network and \emph{learns} $M_t(\theta)$; in the isotropic case, it recovers standard score matching, making schedule learning equivalent to choosing the weighting over noise levels. We further derive an efficient estimator for $\partial_\theta \nabla \log p_t$ that enables efficient optimization of $M_t$. For inference, we develop an anisotropic reverse‑ODE sampler based on a second‑order Heun update with a closed‑form step, and we learn a scalar time transform $r(t;\gamma)$ that reduces discretization error. Across CIFAR‑10, AFHQv2, and FFHQ, our method matches EDM overall and substantially improves few‑step generation. Together, these pieces yield a practical, trajectory‑optimal recipe for anisotropic diffusion.
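To make the sampler concrete, here is a minimal sketch of a second‑order Heun step for an EDM‑style probability‑flow ODE, $\mathrm{d}x/\mathrm{d}t = (x - D(x,t))/t$. This is an illustrative baseline only: the `denoise` argument stands in for a trained denoiser $D(x,t)$, and the paper's anisotropic variant replaces the scalar $1/t$ factor with the matrix‑valued schedule $M_t$ (not shown here).

```python
def heun_step(x, t_cur, t_next, denoise):
    """One second-order Heun step of the probability-flow ODE
    dx/dt = (x - D(x, t)) / t.

    `denoise` is a placeholder for a trained denoiser D(x, t); the
    anisotropic sampler in the paper generalizes the scalar 1/t factor
    to a matrix-valued schedule, which this sketch omits.
    """
    d_cur = (x - denoise(x, t_cur)) / t_cur                # slope at t_cur
    x_euler = x + (t_next - t_cur) * d_cur                 # Euler predictor
    d_next = (x_euler - denoise(x_euler, t_next)) / t_next # slope at t_next
    return x + (t_next - t_cur) * 0.5 * (d_cur + d_next)   # Heun corrector
```

With the trivial denoiser $D \equiv 0$, the ODE reduces to $\mathrm{d}x/\mathrm{d}t = x/t$, whose solution $x(t) = x_0\, t/t_0$ is linear in $t$, so the second‑order Heun step reproduces it exactly.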
Code is available at anonymous.4open.science/r/anisotropic-diffusion-paper-8738.
Primary Area: generative models
Submission Number: 20950