Dense Associative Memory on the Bures-Wasserstein Space

ICLR 2026 Conference Submission 15581 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Dense Associative Memory, Generative Modeling, Optimal Transport, Energy-Based Models, Storage and Retrieval
TL;DR: We extend associative memories from vectors to probability distributions using Wasserstein geometry, proving storage capacity guarantees and showing robust distributional retrieval in both synthetic and real-world tasks.
Abstract: Dense associative memories (DAMs) store and retrieve patterns as fixed points of an energy functional, but existing models are limited to vector representations. We extend DAMs to probability distributions equipped with the 2-Wasserstein distance, focusing mainly on the Bures–Wasserstein class of Gaussian densities. Our framework defines a log-sum-exp energy over stored distributions and a retrieval dynamics that aggregates optimal transport maps with Gibbs weights. Stationary points correspond to self-consistent Wasserstein barycenters, generalizing classical DAM fixed points. We prove exponential storage capacity, provide quantitative retrieval guarantees under Wasserstein perturbations, and validate the model on synthetic and real-world distributional tasks. This work elevates associative memory from vectors to full distributions, bridging classical DAMs with modern generative modeling and enabling distributional storage and retrieval in memory-augmented learning.
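The retrieval dynamics sketched in the abstract (Gibbs-weighted aggregation of optimal transport maps between Gaussians) can be illustrated with the closed-form Bures–Wasserstein formulas. The sketch below is a hypothetical reconstruction from the abstract alone, not the authors' code: `beta` (the inverse temperature) and the specific aggregation rule are assumptions. It uses the standard closed forms for the squared 2-Wasserstein distance between Gaussians and for the linear part of the OT map.

```python
import numpy as np
from scipy.linalg import sqrtm  # matrix square root for the Bures term

def bw2_sq(m1, S1, m2, S2):
    """Squared 2-Wasserstein distance between N(m1,S1) and N(m2,S2):
    ||m1-m2||^2 + tr(S1 + S2 - 2*(S1^{1/2} S2 S1^{1/2})^{1/2})."""
    r1 = np.real(sqrtm(S1))
    cross = np.real(sqrtm(r1 @ S2 @ r1))
    return float(np.sum((m1 - m2) ** 2) + np.trace(S1 + S2 - 2.0 * cross))

def ot_map_matrix(S_from, S_to):
    """Linear part A of the OT map between centered Gaussians:
    A = S_from^{-1/2} (S_from^{1/2} S_to S_from^{1/2})^{1/2} S_from^{-1/2}."""
    r = np.real(sqrtm(S_from))
    r_inv = np.linalg.inv(r)
    return r_inv @ np.real(sqrtm(r @ S_to @ r)) @ r_inv

def retrieve(query, patterns, beta=5.0, steps=10):
    """Hypothetical retrieval: iterate a Gibbs-weighted average of the OT
    maps from the current Gaussian toward each stored Gaussian pattern."""
    m, S = query
    for _ in range(steps):
        neg_energy = np.array([-beta * bw2_sq(m, S, mi, Si)
                               for mi, Si in patterns])
        w = np.exp(neg_energy - neg_energy.max())
        w /= w.sum()  # softmax (Gibbs) weights over stored patterns
        # Averaged transport map applied to the current Gaussian.
        A = sum(wi * ot_map_matrix(S, Si) for wi, (mi, Si) in zip(w, patterns))
        m = sum(wi * mi for wi, (mi, Si) in zip(w, patterns))
        S = A @ S @ A.T
    return m, S

# Store two well-separated Gaussian patterns and query near the first one.
patterns = [(np.zeros(2), np.eye(2)),
            (np.full(2, 5.0), 2.0 * np.eye(2))]
query = (np.array([0.5, 0.2]), 1.2 * np.eye(2))
m_out, S_out = retrieve(query, patterns)
```

With well-separated patterns and a moderate `beta`, the Gibbs weights concentrate on the nearest stored distribution and the iteration converges to it, mirroring the self-consistent-barycenter fixed points the abstract describes.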
Supplementary Material: zip
Primary Area: generative models
Submission Number: 15581