Modern Hopfield Networks meet Encoded Neural Representations - Addressing Practical Considerations

Published: 10 Oct 2024, Last Modified: 30 Oct 2024
Venue: UniReps
License: CC BY 4.0
Supplementary Material: pdf
Track: Proceedings Track
Keywords: Hetero-associative Hopfield networks, self-attention, Hopfield networks, encoded Hopfield networks, kernel memory networks
TL;DR: We propose Hopfield Encoding Networks (HEN), which enhances pattern separation by encoding input representations into a latent space before storage. This method delays the onset of metastability and significantly increases storage capacity.
Abstract: Content-addressable memories such as Modern Hopfield Networks (MHN) have been studied as mathematical models of auto-association and storage/retrieval in human declarative memory, yet their practical use for large-scale content storage faces challenges. Chief among them is the occurrence of meta-stable states, particularly when handling large amounts of high-dimensional content. This paper introduces Hopfield Encoding Networks (HEN), a framework that integrates encoded neural representations into MHNs to improve pattern separability and reduce meta-stable states. We show that HEN can also be used for retrieval in the context of hetero-association of images with natural language queries, thus removing the limitation of requiring access to partial content in the same domain. Experimental results demonstrate a substantial reduction in meta-stable states and increased storage capacity while still enabling perfect recall of a significantly larger number of inputs, advancing the practical utility of associative memory networks for real-world tasks.
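For context, the retrieval dynamics of a Modern Hopfield Network (the starting point HEN builds on) can be sketched with the standard softmax update rule of Ramsauer et al.: a query state is repeatedly replaced by a similarity-weighted combination of the stored patterns. This is a minimal illustrative sketch, not the paper's implementation; the function names, the inverse-temperature value `beta`, and the use of raw NumPy vectors (rather than the encoded latent representations HEN actually stores) are all assumptions for demonstration.

```python
import numpy as np

def softmax(v, beta=1.0):
    # Numerically stable softmax with inverse temperature beta.
    z = beta * v
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def mhn_retrieve(X, query, beta=20.0, steps=3):
    """One-shot associative recall with the Modern Hopfield update.

    X     : (d, N) array whose columns are the N stored patterns.
    query : (d,) partial or noisy cue.
    Each step replaces the state with a softmax-weighted mixture of
    stored patterns; higher beta sharpens retrieval toward one pattern.
    """
    xi = query.copy()
    for _ in range(steps):
        xi = X @ softmax(X.T @ xi, beta)
    return xi

# Hypothetical usage: store 5 random unit patterns, cue with a noisy copy.
rng = np.random.default_rng(0)
X = rng.standard_normal((64, 5))
X /= np.linalg.norm(X, axis=0)
noisy_cue = X[:, 2] + 0.1 * rng.standard_normal(64)
recalled = mhn_retrieve(X, noisy_cue)
recalled /= np.linalg.norm(recalled)
```

The meta-stable states the abstract refers to arise in exactly this update: when several stored patterns are similar (low separability), the softmax spreads its mass over them and the state settles on a blend rather than a single memory. HEN's encoding step is aimed at increasing the separation between stored patterns so that this mixture collapses onto the correct one.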
Submission Number: 42