Edge-preserving noise for diffusion models

19 Sept 2025 (modified: 26 Sept 2025) · ICLR 2026 Conference Withdrawn Submission · CC BY 4.0
Keywords: generative modeling, diffusion models, anisotropic noise
TL;DR: We introduce a novel structure-aware noise scheduler that generalizes isotropic diffusion models and allows models to explicitly learn geometric structures in the training data.
Abstract: Classical diffusion models typically rely on isotropic Gaussian noise, treating all regions uniformly and overlooking structural information that may be vital for high-quality generation. We introduce an edge-preserving diffusion process that generalizes isotropic models through a hybrid noise scheme. At its core is an edge-aware scheduler that transitions smoothly from edge-preserving to isotropic noise, allowing the model to capture fine structural details while maintaining global generation quality. To measure the impact of structure-aware noise on the generative process, we analyze and evaluate our edge-preserving process against isotropic models in both diffusion and flow-matching frameworks. Importantly, we show that existing isotropic models can be efficiently fine-tuned with edge-preserving noise, making our approach practical for adapting pre-trained systems. Beyond improvements in unconditional generation, it offers significant benefits in structure-guided tasks such as stroke-to-image synthesis, improving robustness, fidelity, and perceptual quality. Extensive evaluations (FID, KID, CLIP-score) show consistent improvements of up to 30%, highlighting edge-preserving noise as a simple yet powerful advance for generative diffusion, particularly in structure-guided settings.
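
The abstract describes a hybrid noise scheme whose scheduler moves from edge-preserving to isotropic noise over diffusion time, but gives no implementation details. The sketch below shows one way such a scheme could look; the Sobel edge map, the linear blend weight lambda(t) = t, and the `edge_strength` parameter are illustrative assumptions, not the authors' actual scheduler.

```python
# Minimal sketch (not the paper's implementation) of a hybrid, edge-aware
# noise scheme: noise variance is attenuated on edges early in the forward
# process and relaxes to isotropic Gaussian noise as t -> 1.
import torch
import torch.nn.functional as F

def sobel_edge_map(x: torch.Tensor) -> torch.Tensor:
    """Per-pixel edge magnitude in [0, 1] from Sobel gradients; x is (B, C, H, W)."""
    kx = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]], device=x.device)
    ky = kx.t()
    gray = x.mean(dim=1, keepdim=True)                    # collapse channels
    gx = F.conv2d(gray, kx.view(1, 1, 3, 3), padding=1)
    gy = F.conv2d(gray, ky.view(1, 1, 3, 3), padding=1)
    mag = (gx ** 2 + gy ** 2).sqrt()
    return mag / (mag.amax(dim=(-2, -1), keepdim=True) + 1e-8)

def edge_preserving_noise(x0: torch.Tensor, t: torch.Tensor,
                          edge_strength: float = 0.8) -> torch.Tensor:
    """Anisotropic noise for clean images x0 at diffusion time t in [0, 1].

    `edge_strength` (an assumed hyperparameter) controls how strongly the
    per-pixel noise std is reduced on edges at t = 0.
    """
    eps = torch.randn_like(x0)                            # isotropic base noise
    edges = sobel_edge_map(x0)                            # (B, 1, H, W)
    aniso_std = 1.0 - edge_strength * edges               # smaller std on strong edges
    lam = t.view(-1, 1, 1, 1)                             # blend: 0 = edge-aware, 1 = isotropic
    std = (1.0 - lam) * aniso_std + lam * 1.0
    return std * eps

# Usage: x0 is a batch of training images in [-1, 1], t the sampled diffusion time.
# x0 = torch.rand(4, 3, 64, 64) * 2 - 1
# t = torch.rand(4)
# noisy = x0 + edge_preserving_noise(x0, t)
```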
Supplementary Material: zip
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2026/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 21780