I-Diff: Isotropy-Based Regularization for Generation of Complex Data Distributions

ICLR 2026 Conference Submission 19288 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Generative Models, Diffusion Models, Structure, Image Generation, Gaussian Noise
Abstract: Denoising Diffusion Probabilistic Models (DDPMs) have significantly advanced generative AI, achieving impressive results in high-quality image and data generation. However, enhancing consistency and fidelity remains a key challenge in the field. The conventional DDPM framework depends solely on the $L^2$ norm between the added and predicted noise to generate new data distributions. Because it does not explicitly impose structural information on the data distribution, it is limited in its ability to capture complex geometries (e.g., multimodality, asymmetry, anisotropy) that are common in generation tasks. To address this limitation, we introduce I-Diff, an improved version of DDPM incorporating a carefully designed regularizer that enables the model to encode structural information and capture anisotropy, preserving the inherent structure of the data distribution. Notably, our method is model-agnostic and can be easily integrated into any DDPM variant. The proposed approach is validated through extensive experiments on DDPM and Latent Diffusion Models across multiple datasets. Empirical results demonstrate a 47% reduction in FID on the CIFAR-100 dataset compared to the default DDPM, as well as significant improvements in fidelity (Density and Precision increase by 27% and 16% on CIFAR-100, respectively) across the other tested datasets. These results highlight the effectiveness of our method in enhancing generative quality by capturing complex geometries in data distributions.
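To make the training objective concrete: the abstract describes the conventional DDPM loss as the $L^2$ distance between the added and predicted noise, augmented in I-Diff by a structure-aware regularizer. The sketch below is a minimal NumPy illustration under stated assumptions; the actual I-Diff regularizer is not specified in the abstract, so the covariance-matching penalty here (and the function name `i_diff_loss` and weight `lam`) are hypothetical placeholders for any term that encourages the model to reproduce anisotropic structure.

```python
import numpy as np

def i_diff_loss(eps_true, eps_pred, lam=0.1):
    """Conventional DDPM noise-matching loss plus a hypothetical
    structure-aware regularizer (illustrative only; the actual I-Diff
    regularizer is not given in the abstract).

    eps_true, eps_pred: (batch, dim) arrays of added / predicted noise.
    lam: regularization weight (assumed hyperparameter).
    """
    # Conventional DDPM objective: L2 between added and predicted noise.
    mse = np.mean((eps_true - eps_pred) ** 2)

    # Illustrative anisotropy-aware penalty: match the second moments
    # (covariances) of the predicted and added noise, so directional
    # structure in the noise estimate is not collapsed to isotropy.
    cov_true = np.cov(eps_true, rowvar=False)
    cov_pred = np.cov(eps_pred, rowvar=False)
    reg = np.mean((cov_true - cov_pred) ** 2)

    return mse + lam * reg
```

Since the regularizer only adds a scalar term to the standard objective, this composition is consistent with the abstract's claim that the method is model-agnostic and can be dropped into any DDPM variant's loss.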
Primary Area: generative models
Submission Number: 19288