Track: long paper (up to 8 pages)
Keywords: Denoising Diffusion Probabilistic Models, Information Bottleneck Principle, Symmetry, Image Generation
TL;DR: Symmetry Is All You Need
Abstract: Denoising Diffusion Probabilistic Models (DDPMs) typically rely on zero-mean white Gaussian noise during both training and sample generation. In this paper, we demonstrate that image generation in pre-trained DDPMs is not restricted to Gaussian noise; any zero-mean, symmetrically distributed noise or signal can be equally effective. We present theoretical and empirical evidence that symmetry, rather than the specific distribution type, is a sufficient condition for successful image generation with a pre-trained DDPM. Our findings enable DDPMs to operate more flexibly on resource-constrained devices by utilizing alternative noise or signal types, making them suitable for applications such as semantic communication, where symmetric noise and signals are prevalent. This work opens new avenues for efficient and adaptable generative models in real-world environments.
Submission Number: 21
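The abstract's central claim, that the Gaussian noise injected during DDPM sampling can be replaced by any zero-mean, symmetric, unit-variance distribution, can be illustrated with a minimal sketch of the standard DDPM reverse step. This is not the paper's implementation: `eps_theta` is a hypothetical stand-in for a trained noise-prediction network, and the uniform and Laplace noise sources are illustrative examples of symmetric distributions scaled to unit variance.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zero-mean, unit-variance symmetric noise sources. Per the paper's claim,
# any such distribution may replace the usual Gaussian during sampling.
def gaussian(shape):
    return rng.standard_normal(shape)

def uniform_symmetric(shape):
    # Uniform on [-sqrt(3), sqrt(3)]: mean 0, variance a^2/3 = 1.
    a = np.sqrt(3.0)
    return rng.uniform(-a, a, size=shape)

def laplace_symmetric(shape):
    # Laplace with scale 1/sqrt(2): mean 0, variance 2*b^2 = 1.
    return rng.laplace(0.0, 1.0 / np.sqrt(2.0), size=shape)

def ddpm_reverse_step(x_t, t, eps_theta, betas, noise_fn):
    """One standard DDPM reverse-diffusion step; the injected noise is
    drawn from `noise_fn` instead of being hard-coded as Gaussian."""
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)
    coef = (1.0 - alphas[t]) / np.sqrt(1.0 - alpha_bar[t])
    mean = (x_t - coef * eps_theta(x_t, t)) / np.sqrt(alphas[t])
    sigma = np.sqrt(betas[t]) if t > 0 else 0.0  # no noise at the final step
    return mean + sigma * noise_fn(x_t.shape)

# Toy placeholder for a trained noise-prediction network.
def eps_theta(x, t):
    return 0.1 * x

betas = np.linspace(1e-4, 0.02, 1000)  # common linear schedule
x = gaussian((4, 4))
for noise_fn in (gaussian, uniform_symmetric, laplace_symmetric):
    x_prev = ddpm_reverse_step(x, 500, eps_theta, betas, noise_fn)
```

The only change relative to vanilla DDPM sampling is the `noise_fn` argument; the reverse-step mean and variance schedule are untouched, which is what makes the substitution compatible with a pre-trained model.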