Correlated dense associative memories

21 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: general machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: inhibition, dense associative memory, Hopfield network, community detection, clustering, navigation
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: We introduce Correlated Dense Associative Memory (CDAM), which combines graph-correlated, continuous-valued, structured memories with anti-Hebbian learning to demonstrate hierarchical segmentation, oriented recall, and stable temporal sequence memory.
Abstract: Associative memory networks store memory patterns by forming dynamic attractors around chosen states of neurons. These attractors do not, however, need to be fixed points or single memory patterns. By correlating attractors, we may represent temporally or spatially related sequences or groups of stimuli. By further modulating these correlations using inhibitory (anti-Hebbian) learning rules, we show how such sequences and groups may be hierarchically segmented at multiple scales. In combination with the dramatically increased storage capacity of dense associative memory networks (also known as modern Hopfield networks) and their connection to Transformers, our results have implications for both machine learning and neurobiology. We demonstrate this by applying our network, dubbed *Correlated Dense Associative Memory (CDAM)*, to model multi-scale representations of community structures in graphs, oriented recall in a symmetric connection regime, and temporal sequences with distractors in an asymmetric connection regime.
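
To make the core idea concrete, here is a minimal NumPy sketch of one retrieval step in a graph-correlated dense associative memory. This is an illustration under stated assumptions, not the paper's exact formulation: it assumes the standard softmax update of continuous modern Hopfield networks, and the function name `cdam_update`, the mixing strength `alpha`, and the signed `adjacency` matrix (positive entries acting Hebbian, negative entries anti-Hebbian) are hypothetical choices made for this sketch.

```python
import numpy as np

def cdam_update(state, patterns, adjacency, beta=8.0, alpha=0.5):
    """One retrieval step of a graph-correlated dense associative memory.

    A minimal sketch, not the authors' exact method:
    - `patterns`: (N, P) matrix of P continuous-valued memory vectors.
    - `adjacency`: (P, P) graph coupling between memories; positive
      entries attract correlated memories (Hebbian), negative entries
      suppress them (anti-Hebbian / inhibitory).
    - `alpha` (hypothetical) scales how strongly the memory graph
      modulates the standard softmax attention over stored patterns.
    """
    # Similarity of the current state to each stored pattern.
    scores = beta * patterns.T @ state              # shape (P,)
    attention = np.exp(scores - scores.max())
    attention /= attention.sum()                    # softmax over memories
    # Spread attention along the memory graph, so correlated memories
    # are co-activated and inhibited ones are pushed down.
    mixed = attention + alpha * adjacency @ attention
    return patterns @ mixed                         # updated state, shape (N,)

# Toy usage: memories 0 and 1 are excitatorily coupled, memory 2 is
# inhibited relative to them; probing near memory 0 should also pull
# in memory 1 while suppressing memory 2.
rng = np.random.default_rng(0)
X = rng.standard_normal((64, 3))
A = np.array([[0.0,  1.0, -1.0],
              [1.0,  0.0, -1.0],
              [-1.0, -1.0, 0.0]])
probe = X[:, 0] + 0.1 * rng.standard_normal(64)
state = cdam_update(probe, X, A)
```

With `adjacency` set to the zero matrix, the update reduces to the usual dense-associative-memory retrieval of a single pattern; the signed graph couplings are what allow groups or sequences of memories to be recalled together or segmented apart.
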
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3576