Accelerating Hierarchical Associative Memory: A Deep Equilibrium Approach

Published: 27 Oct 2023, Last Modified: 26 Nov 2023 (AMHN23 Poster)
Keywords: Hierarchical Associative Memory, Deep Equilibrium Models, even-odd splitting
TL;DR: Hierarchical Associative Memory models can be made much faster by casting them as Deep Equilibrium Models and alternating optimization between even and odd layers.
Abstract: Hierarchical Associative Memory models have recently been proposed as a versatile extension of continuous Hopfield networks. To facilitate future research on such models, especially at scale, we focus on increasing their simulation efficiency on digital hardware. In particular, we propose two strategies to speed up memory retrieval in these models, which corresponds to their use at inference but is equally important during training. First, we show how they can be cast as Deep Equilibrium Models, which allows the use of faster and more stable solvers. Second, inspired by earlier work, we show that alternating optimization of the even and odd layers accelerates memory retrieval by a factor close to two. Combined, these two techniques allow for much faster energy minimization, as shown in our proof-of-concept experimental results. The code is available at https://github.com/cgoemaere/hamdeq.
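The sketch below illustrates the core idea of the even-odd splitting on a toy Hierarchical Associative Memory. Because each layer only interacts with its immediate neighbors, all even-indexed layers can be updated in parallel while the odd ones are held fixed, and vice versa. This is a minimal illustration, not the authors' implementation: the layer sizes, tanh activation, weight scale, and the helper names (`layer_update`, `ham_even_odd_solve`) are assumptions for the example. In the full Deep Equilibrium framing, the same update map would instead be handed to a generic fixed-point solver (e.g., Anderson acceleration) and differentiated implicitly; see the linked repository for the actual code.

```python
# Minimal sketch (not the authors' code) of memory retrieval in a
# Hierarchical Associative Memory via alternating even/odd layer updates.
# Layer sizes, activation, and function names are illustrative assumptions.
import torch

torch.manual_seed(0)

sizes = [784, 256, 64, 10]             # layer 0 is clamped to the input
Ws = [torch.randn(m, n) * 0.01         # Ws[l] connects layer l to layer l+1
      for n, m in zip(sizes[:-1], sizes[1:])]
act = torch.tanh                        # pointwise activation g(.)

def layer_update(xs, l):
    """Fixed-point update for layer l given its neighbors only."""
    pre = Ws[l - 1] @ act(xs[l - 1])                # bottom-up input
    if l + 1 < len(xs):
        pre = pre + Ws[l].T @ act(xs[l + 1])        # top-down input
    return pre

def ham_even_odd_solve(x0, n_iters=50, tol=1e-6):
    """Iterate to the energy minimum, alternating layer parities.

    Since layer l depends only on layers l-1 and l+1, all layers of one
    parity can be updated simultaneously while the other parity is held
    fixed; the paper reports a speedup close to two from this splitting.
    """
    xs = [x0] + [torch.zeros(s) for s in sizes[1:]]
    for _ in range(n_iters):
        prev = [x.clone() for x in xs[1:]]
        for l in range(2, len(xs), 2):              # even hidden layers
            xs[l] = layer_update(xs, l)
        for l in range(1, len(xs), 2):              # odd hidden layers
            xs[l] = layer_update(xs, l)
        delta = max((xs[l + 1] - p).abs().max().item()
                    for l, p in enumerate(prev))
        if delta < tol:                             # reached the fixed point
            break
    return xs

xs = ham_even_odd_solve(torch.randn(784))
print([tuple(x.shape) for x in xs[1:]])  # equilibrium hidden-layer states
```

With the small random weights used here the update map is a contraction, so plain iteration converges; in practice the DEQ formulation lets one swap this loop for a faster and more stable solver without changing the model.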
Submission Number: 34