Provably Optimal Memory Capacity for Modern Hopfield Models: Transformer-Compatible Dense Associative Memories as Spherical Codes
Keywords: Modern Hopfield Models, Dense Associative Memory, Transformer, Transformer Representation Learning, Memory Capacity, Foundation Model
TL;DR: We enhance Modern Hopfield Networks with a kernelized approach that improves memory storage and retrieval, achieving optimal capacity with a sub-linear time algorithm.
Abstract: We study the optimal memorization capacity of modern Hopfield models and Kernelized Hopfield Models (KHMs), a transformer-compatible class of Dense Associative Memories.
We present a tight analysis by establishing a connection between the memory configuration of KHMs and spherical codes from information theory.
Specifically, we treat the stored memory set as a specialized spherical code.
This enables us to cast the memorization problem in KHMs as a point-arrangement problem on a hypersphere.
We show that the optimal capacity of KHMs occurs when the feature space allows memories to form an optimal spherical code.
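To make the point-arrangement view concrete, here is a minimal sketch (our own illustration, not code from the paper or its supplementary material; the function name `spherical_code_separation` and the random data are hypothetical). It normalizes a set of memory feature vectors onto the unit hypersphere and reports their worst-case pairwise similarity, the quantity that a well-separated spherical code keeps small.

```python
# Illustrative sketch only: view memories mapped to the unit hypersphere as a
# spherical code and measure their worst-case separation.
import numpy as np

def spherical_code_separation(features: np.ndarray) -> float:
    """Return the largest pairwise cosine similarity among unit-normalized rows.

    Smaller values mean the memories are more spread out on the hypersphere,
    i.e., they form a better-separated spherical code.
    """
    # Project each memory's feature vector onto the unit hypersphere.
    unit = features / np.linalg.norm(features, axis=1, keepdims=True)
    gram = unit @ unit.T                # pairwise inner products (cosines)
    np.fill_diagonal(gram, -np.inf)     # ignore each point's self-similarity
    return float(gram.max())

# Hypothetical example: M = 64 random memories in a D = 32 dimensional feature space.
rng = np.random.default_rng(0)
memories = rng.standard_normal((64, 32))
print(spherical_code_separation(memories))
```

In this picture, improving the feature map so that this separation measure decreases corresponds to moving the stored memories toward a better spherical code.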
This unique perspective leads to:
1. An analysis of how KHMs achieve optimal memory capacity, along with the corresponding necessary conditions.
Importantly, we establish an upper capacity bound that matches the well-known exponential lower bound in the literature.
This provides the first tight and optimal asymptotic memory capacity for modern Hopfield models.
2. A sub-linear time algorithm $\mathtt{U}\text{-}\mathtt{Hop}$+ to reach KHMs' optimal capacity.
3. An analysis of how the required feature dimension scales with the number of stored memories (an illustrative form of this relationship is sketched right after this list).
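As a rough, illustrative reading of how items 1 and 3 fit together (the form below is generic; the constant $c$ and the symbol $D_\Phi$ for the feature dimension are our notation, not the paper's statement): a capacity that grows exponentially in the feature dimension, once inverted, implies a feature dimension that needs to grow only logarithmically in the number of stored memories $M$:
$$
M_{\max}(D_\Phi) \;=\; \Theta\!\left(e^{c\,D_\Phi}\right)
\quad\Longrightarrow\quad
D_\Phi \;=\; \Theta\!\left(\tfrac{1}{c}\,\log M\right)
\ \text{ suffices to store } M \text{ memories.}
$$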
Together, these results improve both the retrieval capability of KHMs and the representation learning of the corresponding transformers.
Experimentally, we provide thorough numerical results that support our theoretical findings.
Supplementary Material: zip
Primary Area: Neuroscience and cognitive science (neural coding, brain-computer interfaces)
Submission Number: 1731