Track: long paper (up to 5 pages)
Keywords: Hopfield models, Dense Associative Memory models, Diffusion models, Energy-based models
TL;DR: This paper provides a novel perspective on the memorization-generalization phenomenon in diffusion models (DMs) through the lens of Hopfield models, supporting the view of DMs as associative memory models operating above the critical memory load.
Abstract: Hopfield networks are associative memory (AM) systems designed to store and retrieve patterns as local minima of an energy landscape. In the classical Hopfield model, an interesting phenomenon occurs when the amount of training data reaches the critical memory load: spurious states, or unintended stable points, emerge at the end of the retrieval dynamics. These states often appear as mixtures of the stored patterns, leading to incorrect recall. In this work, we examine diffusion models, commonly used in generative modeling, from the perspective of AMs. The training phase of a diffusion model is conceptualized as memory encoding (the training data is stored in the memory), and the generation phase is viewed as an attempt at memory retrieval. In the small-data regime, the diffusion model exhibits a strong memorization phase, in which the network creates distinct basins of attraction around each sample in the training set, akin to the Hopfield model below the critical memory load. In the large-data regime, a different phase appears in which increasing the size of the training set fosters the creation of new attractor states that correspond to manifolds of the generated samples. Spurious states appear at the boundary of this transition: they are emergent attractor states that are absent from the training set but still have distinct basins of attraction around them. Our findings provide a novel perspective on the memorization-generalization phenomenon in diffusion models through the lens of AMs, supporting the view of diffusion models as AMs operating above the critical memory load.
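As background for the abstract's reference to the critical memory load, the following is a minimal sketch (not part of the submission, all names and parameters illustrative) of the classical Hebbian Hopfield model: below capacity, retrieval dynamics clean up a noisy probe into the stored pattern; above the critical load (roughly 0.14 patterns per neuron), the dynamics instead settle into spurious, mixture-like fixed points.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_hopfield(patterns):
    """Hebbian weight matrix for binary (+/-1) patterns; zero self-couplings."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def retrieve(W, state, steps=50):
    """Synchronous retrieval dynamics: iterate sign(W @ s) until a fixed point."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1  # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

n = 200  # number of neurons
for p in (5, 60):  # load alpha = p/n: 0.025 (below) vs 0.30 (above critical)
    patterns = rng.choice([-1, 1], size=(p, n))
    W = train_hopfield(patterns)
    # probe with a corrupted copy of the first stored pattern (10% flipped bits)
    probe = patterns[0].copy()
    flip = rng.choice(n, size=n // 10, replace=False)
    probe[flip] *= -1
    out = retrieve(W, probe)
    overlap = abs(out @ patterns[0]) / n  # 1.0 means perfect recall
    print(f"p={p} (alpha={p/n:.3f}): overlap with stored pattern = {overlap:.2f}")
```

Under these assumptions, the low-load run recovers the stored pattern (overlap near 1.0), while the high-load run converges to a state with a much weaker overlap, i.e., a spurious attractor of the kind the abstract identifies with the memorization-generalization boundary in diffusion models.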
Submission Number: 8