A New Type of Associative Memory Network with Exponential Storage Capacity

23 Sept 2023 (modified: 11 Feb 2024) — Submitted to ICLR 2024
Supplementary Material: zip
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Associative memory, dense Hopfield networks, self-attention
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: A Hopfield network with exponential robust storage capacity, fast retrieval of memories, and a biologically plausible design (i.e., a sparse one-to-one connectivity graph)
Abstract: Recent developments have sought to overcome the inherent limitations of traditional associative memory models, such as Hopfield networks, whose storage capacity scales only linearly with input dimension. In this paper, we present a new extension of Hopfield networks that grants precise control over inter-neuron interactions while also allowing the level of connectivity within the network to be tuned. This versatile framework encompasses a variety of designs, including classical Hopfield networks, models with polynomial activation functions, and simplicial Hopfield networks as particular cases. Remarkably, a specific instance of our construction, resulting in a new self-attention mechanism, is characterized by quasi-exponential storage capacity and a sparse network structure, aligning with biological plausibility. To our knowledge, our proposed construction introduces the first biologically plausible associative memory model with exponential storage capacity. Furthermore, the resulting model admits a very efficient implementation via vectorization; it can therefore fully exploit modern numerical computation hardware such as GPUs. This work not only advances the theoretical foundations of associative memory but also provides insights into the development of neurobiologically inspired associative memory systems with unprecedented capabilities.
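The abstract references the family of dense (modern) Hopfield networks with attention-style retrieval and a vectorized implementation. The paper's specific construction is not reproduced on this page, so the sketch below is only a generic illustration of that family, assuming the standard softmax-attention retrieval update over stored patterns; the function name `retrieve` and all parameter values are illustrative choices, not the authors' model.

```python
import numpy as np

def retrieve(memories, query, beta=8.0, steps=3):
    """Generic dense-associative-memory retrieval (illustrative, not the
    paper's construction).

    memories: (N, d) array of stored patterns; query: (d,) corrupted probe.
    Each step replaces the state with a softmax-weighted mixture of the
    stored patterns -- the attention-style update of modern Hopfield nets.
    The update is fully vectorized, hence GPU-friendly as the abstract notes.
    """
    x = query.copy()
    for _ in range(steps):
        scores = beta * (memories @ x)     # similarity to each stored pattern
        w = np.exp(scores - scores.max())  # numerically stable softmax
        w /= w.sum()
        x = w @ memories                   # move toward the closest pattern
    return x

rng = np.random.default_rng(0)
M = rng.choice([-1.0, 1.0], size=(16, 64))  # 16 random +/-1 patterns in d=64
probe = M[3].copy()
probe[:10] *= -1                            # corrupt 10 of the 64 entries
out = retrieve(M, probe)
print(np.allclose(np.sign(out), M[3]))      # → True: pattern 3 is recovered
```

With the inverse temperature `beta` large enough, the softmax concentrates almost all weight on the stored pattern closest to the probe, so a handful of iterations suffices to clean up the corrupted query.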
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7323