Modern Hopfield Networks with Continuous-Time Memories

Published: 05 Mar 2025, Last Modified: 20 Apr 2025
Venue: NFAM 2025 Poster
License: CC BY 4.0
Track: long paper (up to 5 pages)
Keywords: Continuous attention, Hopfield Networks, Efficiency, Continuous memories
Abstract: Recent research has established a connection between modern Hopfield networks (HNs) and transformer attention heads, with guarantees of exponential storage capacity. However, these models still face challenges in scaling storage efficiently. Inspired by psychological theories of continuous neural resource allocation in working memory, we propose an approach that compresses large discrete Hopfield memories into smaller, continuous-time memories. Leveraging continuous attention, our new energy function modifies the update rule of HNs, replacing the traditional softmax-based probability mass function with a probability density over the continuous memory. This formulation aligns with modern perspectives on human executive function, offering a principled link between attractor dynamics in working memory and resource-efficient memory allocation. Our framework maintains performance competitive with HNs while operating on a compressed memory, reducing computational costs across synthetic and video datasets.
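To make the idea concrete, below is a minimal numerical sketch of a continuous-time Hopfield update, not the authors' exact formulation. It assumes a Gaussian RBF basis for the continuous memory, a ridge-regression fit of the compressed coefficients, a Gibbs density in place of softmax, and grid quadrature for the expectation; all function names and parameters are illustrative.

```python
import numpy as np

# Sketch: compress N discrete patterns into a continuous-time memory
# xbar(t) = psi(t) @ B, then iterate the continuous analogue of the
# modern Hopfield update q <- X^T softmax(beta X q), namely
# q <- E_{p(t)}[xbar(t)] with density p(t) ∝ exp(beta q·xbar(t)).
# Basis choice, fitting, and quadrature are assumptions, not the paper's method.

def fit_continuous_memory(X, n_basis=16, lam=1e-3):
    """Fit coefficients B (n_basis x d) so xbar(t) approximates the
    N patterns in X (N x d) placed at locations t in [0, 1]."""
    N, _ = X.shape
    t = np.linspace(0, 1, N)                     # pattern locations
    centers = np.linspace(0, 1, n_basis)         # RBF centers
    width = 1.0 / n_basis                        # RBF bandwidth
    Psi = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / width) ** 2)
    # Ridge regression: B = (Psi^T Psi + lam I)^{-1} Psi^T X
    B = np.linalg.solve(Psi.T @ Psi + lam * np.eye(n_basis), Psi.T @ X)
    return B, centers, width

def continuous_hopfield_update(q, B, centers, width, beta=4.0, n_grid=512):
    """One retrieval step: expectation of xbar(t) under the Gibbs
    density p(t) ∝ exp(beta q·xbar(t)), approximated on a grid."""
    t = np.linspace(0, 1, n_grid)
    Psi = np.exp(-0.5 * ((t[:, None] - centers[None, :]) / width) ** 2)
    Xbar = Psi @ B                               # xbar(t) on the grid
    s = beta * Xbar @ q                          # similarity scores
    p = np.exp(s - s.max())
    p /= p.sum()                                 # normalized quadrature weights
    return p @ Xbar                              # E_{p(t)}[xbar(t)]

# Usage: retrieve a stored pattern from a corrupted query.
rng = np.random.default_rng(0)
X = rng.standard_normal((64, 8))                 # 64 patterns, dimension 8
B, c, w = fit_continuous_memory(X)               # 64 patterns -> 16 coefficients
q = X[10] + 0.1 * rng.standard_normal(8)        # noisy query
for _ in range(3):
    q = continuous_hopfield_update(q, B, c, w)
```

The compression comes from n_basis being much smaller than the number of stored patterns: each update touches only the basis coefficients rather than the full pattern matrix.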
Submission Number: 7