Keywords: Random feature architecture, diffusion models, generalization theorem
TL;DR: A novel adaptive random feature architecture for diffusion models with generalization bounds
Abstract: Diffusion probabilistic models have been successfully used to generate high-resolution data from noise. However, limited interpretability and computationally expensive implementations render most existing diffusion models unsuitable for applications where the goal is to obtain theoretically grounded results at an acceptable trade-off in the quality of generated samples.
In this work, we propose an adaptive random feature architecture for training a diffusion model that can generate samples from the input distribution using limited parameters, and theoretically quantify the generalization bounds.
Our results demonstrate the possibility of building interpretable diffusion model architectures. We validate the theory with numerical experiments, in which our architecture performs better than (or comparably to) a U-Net architecture with the same number of trainable parameters.
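To illustrate the core idea behind a random feature architecture, the sketch below fits a model whose inner weights are sampled once and frozen, so that only the outer linear layer is trained. This is a generic random (Fourier) feature regression on toy data, not the authors' exact diffusion architecture; all names and the ridge-regression fit are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Random feature map: inner weights and biases are sampled once and
# frozen; only the outer linear layer theta is trainable.
d, m = 2, 256                          # input dim, number of random features
W = rng.normal(size=(m, d))            # frozen random inner weights
b = rng.uniform(0, 2 * np.pi, size=m)  # frozen random phases

def features(x):
    """Map inputs of shape (n, d) to random cosine features of shape (n, m)."""
    return np.cos(x @ W.T + b)

# Toy regression target (stand-in for the score-matching objective).
n = 500
X = rng.normal(size=(n, d))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=n)

# Train the outer layer in closed form via ridge regression.
Phi = features(X)
lam = 1e-3
theta = np.linalg.solve(Phi.T @ Phi + lam * np.eye(m), Phi.T @ y)

mse = np.mean((features(X) @ theta - y) ** 2)
```

Because only `theta` is learned while `W` and `b` stay fixed, the trained model is linear in its parameters, which is what makes generalization bounds for such architectures tractable.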
Email Sharing: We authorize the sharing of all author emails with Program Chairs.
Data Release: We authorize the release of our submission and author names to the public in the event of acceptance.
Submission Number: 13