Entropic Memory: A Thermodynamics-Inspired Consolidation Mechanism for Lifelong Agent Learning

Published: 03 Mar 2026, Last Modified: 25 Apr 2026 · ICLR 2026 Workshop MemAgents · CC BY 4.0
Keywords: memory consolidation, LLM agents, retrieval-augmented generation, free energy minimization, simulated annealing
TL;DR: We propose Entropic Memory, a consolidation mechanism for LLM agents that uses free energy minimization to retain useful memories while suppressing noisy ones, improving survival rate by 15% over greedy importance sampling at 50% noise.
Abstract: Large language model (LLM) agents often degrade over long interaction streams because memory accumulates noisy observations that reduce retrieval quality. We propose Entropic Memory, a two-tier memory consolidation mechanism that periodically transfers information from a hot working buffer into a cold long-term store. The method uses a free-energy objective to balance utility against embedding entropy, together with a temperature-controlled stochastic replacement rule. In the controlled Infinite Room environment under a fixed memory budget, Entropic Memory matches greedy importance sampling at 30% noise ($SR \approx 0.29$) and improves survival rate from $0.24$ to $0.28$ at 50% noise (+15% relative). Overall, these results indicate that entropy-aware consolidation improves robustness to distractors in this controlled continual-memory setting.
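To make the abstract's mechanism concrete, the sketch below shows one way a consolidation pass could combine a free-energy score with temperature-controlled stochastic replacement. This is an illustrative reading only, not the authors' implementation: the function and field names (`consolidate`, `utility`, `emb`), the mean-cosine-similarity proxy for embedding entropy, and the Metropolis-style acceptance rule are all assumptions filled in here.

```python
import math
import random

def _cos(a, b):
    # Plain cosine similarity between two embedding vectors.
    num = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return num / (na * nb + 1e-12)

def entropy_proxy(emb, others):
    """Redundancy proxy: mean cosine similarity to other stored embeddings.
    A stand-in for the paper's embedding-entropy term, which is not
    specified in the abstract."""
    if not others:
        return 0.0
    return sum(_cos(emb, o) for o in others) / len(others)

def free_energy(item, others, temperature):
    """F = -utility + T * entropy; lower F = more useful, less redundant."""
    return -item["utility"] + temperature * entropy_proxy(
        item["emb"], [o["emb"] for o in others])

def consolidate(hot_buffer, cold_store, budget, temperature, rng=random.random):
    """One consolidation pass: move hot-buffer items into the cold store
    under a fixed budget, replacing the worst resident stochastically."""
    for item in hot_buffer:
        if len(cold_store) < budget:
            cold_store.append(item)
            continue
        # Score each resident against the rest of the store.
        scores = [free_energy(m, [o for o in cold_store if o is not m], temperature)
                  for m in cold_store]
        worst_idx = max(range(len(cold_store)), key=scores.__getitem__)
        cand_f = free_energy(item, cold_store, temperature)
        delta = scores[worst_idx] - cand_f  # > 0: candidate improves the store
        # Metropolis-style rule: always accept improvements; accept
        # worsening swaps with probability exp(delta / T), so high
        # temperature explores and low temperature is nearly greedy.
        if delta > 0 or rng() < math.exp(delta / max(temperature, 1e-6)):
            cold_store[worst_idx] = item
    hot_buffer.clear()
    return cold_store
```

At `temperature=0` the entropy term vanishes and the rule reduces to greedy utility-based replacement, which matches the abstract's framing of greedy importance sampling as the limiting baseline.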
Submission Number: 50