Keywords: Associative Memory, Hopfield network, exponential storage capacity, self-attention, transformer, biological plausibility
TL;DR: A new type of self-attention layer that is biologically plausible and has quasi-exponential storage capacity
Abstract: Recent developments have sought to overcome the inherent limitations of traditional associative memory models, such as Hopfield networks, whose storage capacity scales only linearly with input dimension.
In this paper, we present a new extension of Hopfield networks that grants precise control over both inter-neuron interactions and the level of connectivity within the network. This versatile framework encompasses a variety of designs as particular cases, including classical Hopfield networks, models with polynomial activation functions, and simplicial Hopfield networks. Remarkably, a specific instance of our construction yields a new self-attention mechanism characterized by quasi-exponential storage capacity and a sparse network structure, in line with biological plausibility.
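For intuition about the Hopfield/self-attention connection the abstract builds on, below is a minimal sketch of the standard softmax retrieval update from the modern (dense) Hopfield literature, ξ ← X softmax(β Xᵀξ). This is an illustration of the general correspondence, not the paper's specific construction; the function name, shapes, and parameters are assumptions chosen for the example.

```python
import numpy as np

def attention_style_hopfield_update(patterns, query, beta=1.0):
    """One retrieval step of a dense (modern) Hopfield network.

    patterns: (d, N) matrix whose columns are the stored patterns.
    query:    (d,) state vector to be completed / denoised.
    beta:     inverse temperature; larger beta sharpens retrieval.
    """
    # Similarity of the query to every stored pattern.
    scores = beta * patterns.T @ query           # shape (N,)
    # Softmax over patterns -- this is the attention weighting.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Updated state: attention-weighted mixture of stored patterns.
    return patterns @ weights                    # shape (d,)

# Toy usage: store random patterns, then retrieve from a noisy cue.
rng = np.random.default_rng(0)
d, N = 64, 16
patterns = rng.standard_normal((d, N))
cue = patterns[:, 3] + 0.3 * rng.standard_normal(d)
retrieved = attention_style_hopfield_update(patterns, cue, beta=8.0)
print(np.argmax(patterns.T @ retrieved))  # expected: 3
```

One update of this rule is exactly a single-query self-attention step with keys and values given by the stored patterns, which is the bridge that allows attention layers to be analyzed as associative memories.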
Submission Number: 6