On Sparse Modern Hopfield Model

Published: 21 Sept 2023, Last Modified: 21 Dec 2023, NeurIPS 2023 poster
Keywords: Hopfield Models; Modern Hopfield Networks; Sparse Attention; Memory Networks
TL;DR: We introduce a sparse modern Hopfield model whose memory-retrieval dynamics correspond to the sparse-structured attention mechanism, enabling robust representation learning, fast convergence, and exponential memory capacity.
Abstract: We introduce the sparse modern Hopfield model as a sparse extension of the modern Hopfield model. Like its dense counterpart, the sparse modern Hopfield model is equipped with memory-retrieval dynamics whose one-step approximation corresponds to the sparse attention mechanism. Theoretically, our key contribution is a principled derivation of a closed-form sparse Hopfield energy using the convex conjugate of the sparse entropic regularizer. Building on this, we derive the sparse memory-retrieval dynamics from the sparse energy function and show that their one-step approximation is equivalent to sparse-structured attention. Importantly, we provide a sparsity-dependent memory-retrieval error bound that is provably tighter than its dense analog, and we thereby identify and discuss the conditions under which the benefits of sparsity arise. In addition, we show that the sparse modern Hopfield model retains the robust theoretical properties of its dense counterpart, including rapid fixed-point convergence and exponential memory capacity. Empirically, we use both synthetic and real-world datasets to demonstrate that the sparse modern Hopfield model outperforms its dense counterpart in many situations.
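For intuition, below is a minimal sketch of the retrieval step the abstract describes, assuming the update x ← Ξ sparsemax(βΞᵀx), i.e., the dense model's softmax replaced by the sparsemax map from sparse-structured attention. The function names (`sparsemax`, `sparse_hopfield_retrieve`) and the NumPy implementation are illustrative, not the authors' released code.

```python
import numpy as np

def sparsemax(z):
    """Euclidean projection of z onto the probability simplex
    (Martins & Astudillo, 2016); yields exactly sparse weights."""
    z_sorted = np.sort(z)[::-1]                 # sort scores in descending order
    cumsum = np.cumsum(z_sorted)
    k = np.arange(1, z.size + 1)
    support = 1.0 + k * z_sorted > cumsum       # entries kept in the support
    k_max = k[support][-1]                      # support size
    tau = (cumsum[k_max - 1] - 1.0) / k_max     # threshold so weights sum to 1
    return np.maximum(z - tau, 0.0)

def sparse_hopfield_retrieve(Xi, x, beta=1.0, steps=1):
    """Retrieval dynamics: x <- Xi @ sparsemax(beta * Xi.T @ x).
    Xi has shape (d, M): M stored patterns of dimension d; x is a query in R^d.
    With steps=1 this is the one-step approximation matching sparse attention."""
    for _ in range(steps):
        x = Xi @ sparsemax(beta * Xi.T @ x)
    return x

# Toy usage (illustrative): retrieve a stored pattern from a noisy query.
rng = np.random.default_rng(0)
Xi = rng.standard_normal((64, 10))                # 10 random memories in R^64
query = Xi[:, 3] + 0.1 * rng.standard_normal(64)  # noisy version of pattern 3
retrieved = sparse_hopfield_retrieve(Xi, query, beta=4.0)
print(np.argmax(Xi.T @ retrieved))                # expected: 3
```

Because sparsemax assigns exactly zero weight to low-scoring memories, a single update can already concentrate entirely on the nearest stored pattern, which is one way to read the tighter sparsity-dependent retrieval-error bound claimed above.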
Supplementary Material: pdf
Submission Number: 1748