Diffusion Probabilistic Models Generalize when They Fail to Memorize

Published: 19 Jun 2023, Last Modified: 28 Jul 2023. 1st SPIGM @ ICML Poster.
Keywords: Diffusion models; Diffusion probabilistic models; Data replication; Memorization; Generalization
TL;DR: Diffusion models generalize when they are unable to fully memorize the training data.
Abstract: In this work, we study the training of diffusion probabilistic models through a series of hypotheses and carefully designed experiments. We call our key finding the memorization-generalization dichotomy, and it asserts that generalization and memorization are mutually exclusive phenomena. This contrasts with the modern wisdom of supervised learning that deep neural networks exhibit "benign" overfitting and generalize well despite overfitting the data.
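Memorization in generative models is often operationalized as data replication: generated samples that land very close to training points. The sketch below is a minimal, hypothetical proxy for measuring this (a nearest-neighbor distance threshold over raw vectors), not the paper's actual metric; the function name, epsilon threshold, and toy data are all illustrative assumptions.

```python
import numpy as np

def replication_rate(generated, train, eps):
    """Fraction of generated samples whose nearest training sample lies
    within Euclidean distance eps -- a simple memorization proxy.
    NOTE: illustrative only; not the metric used in the paper."""
    # Pairwise Euclidean distances, shape (n_generated, n_train).
    d = np.linalg.norm(generated[:, None, :] - train[None, :, :], axis=-1)
    # A generated sample counts as "replicated" if its nearest
    # training neighbor is closer than eps.
    return float(np.mean(d.min(axis=1) < eps))

# Toy check: one exact copy of a training point and one novel sample.
train = np.array([[0.0, 0.0], [1.0, 1.0]])
generated = np.array([[0.0, 0.0], [5.0, 5.0]])
print(replication_rate(generated, train, eps=0.1))  # → 0.5
```

Under the memorization-generalization dichotomy, a model trained to full memorization would drive this rate toward 1, while a generalizing model would keep it near 0.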
Submission Number: 87