Intrinsic Dense Associative Memory on Riemannian Manifolds

Published: 03 Mar 2026, Last Modified: 06 Mar 2026 · NFAM 2026 Poster · CC BY 4.0
Keywords: Dense Associative Memory, Riemannian Manifolds, Intrinsic Learning, Geodesic Energy, Manifold Attention
Abstract: We propose a novel Dense Associative Memory (DenseAM) framework defined intrinsically on a compact Riemannian manifold $\mathcal{M}$, enabling associative memory for manifold-valued data without Euclidean embedding. We introduce two natural geometric extensions of DenseAM on the manifold: (i) the Volume-Corrected Geodesic energy (VC-Geodesic energy), a manifold-KDE energy obtained by incorporating the Riemannian volume density correction term, and (ii) the Geodesic energy, a purely geodesic energy obtained by removing the correction term. We show that these two formulations exhibit fundamentally different behaviors. The geodesic energy admits exact memorization at finite inverse temperature $\beta$, achieves exponential storage capacity in the intrinsic dimension $m=\dim(\mathcal{M})$, and generates abundant emergent memories characterized as local Fréchet means. In contrast, the VC-Geodesic energy introduces a curvature-dependent bias that can destroy exact finite-$\beta$ memorization, particularly on positively curved manifolds. We further derive intrinsic gradient-based inference dynamics expressed via Riemannian exponential and logarithmic maps, leading to a manifold attention mechanism. Our theory is further supported by preliminary simulations on manifold-valued data for statistical inference tasks such as classification and regression.
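To make the inference dynamics concrete, the following is a minimal sketch of a manifold attention update on the unit sphere $S^2$, using the closed-form Riemannian exponential and logarithmic maps. This is an illustration under assumptions, not the paper's exact algorithm: the softmax weighting by squared geodesic distance, the unit step size, and the choice $\beta = 30$ are all assumptions made here for the example.

```python
import numpy as np

def sphere_log(p, q):
    """Riemannian log map on the unit sphere: tangent vector at p pointing to q."""
    c = np.clip(p @ q, -1.0, 1.0)
    theta = np.arccos(c)  # geodesic distance d(p, q)
    if theta < 1e-12:
        return np.zeros_like(p)
    return (theta / np.sin(theta)) * (q - c * p)

def sphere_exp(p, v):
    """Riemannian exponential map on the unit sphere: follow geodesic from p along v."""
    n = np.linalg.norm(v)
    if n < 1e-12:
        return p
    return np.cos(n) * p + np.sin(n) * (v / n)

def manifold_attention_step(x, memories, beta):
    """One attention-style update: move toward the softmax-weighted
    combination of log-map directions to the stored patterns
    (a step toward a local Frechet mean of the nearby memories)."""
    d2 = np.array([np.arccos(np.clip(x @ xi, -1.0, 1.0)) ** 2 for xi in memories])
    w = np.exp(-beta * d2 / 2.0)
    w /= w.sum()
    v = sum(wi * sphere_log(x, xi) for wi, xi in zip(w, memories))
    return sphere_exp(x, v)

# Retrieve a stored pattern from a perturbed query on S^2 (hypothetical setup).
rng = np.random.default_rng(0)
memories = [m / np.linalg.norm(m) for m in rng.normal(size=(4, 3))]
# Query: start 20% of the way along the geodesic from pattern 0 toward pattern 1.
x = sphere_exp(memories[0], 0.2 * sphere_log(memories[0], memories[1]))
for _ in range(50):
    x = manifold_attention_step(x, memories, beta=30.0)
print(np.arccos(np.clip(x @ memories[0], -1.0, 1.0)))  # residual geodesic distance to pattern 0
```

At large $\beta$ the softmax concentrates on the nearest stored pattern, so the iterate settles very close to it; at finite $\beta$ the fixed point is pulled slightly by the other memories, which is where the exact-memorization analysis in the abstract becomes relevant.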
Submission Number: 29