A neural network with key-value episodic memory retrieves and organizes memories based on causal event structures

Hayoung Song, Qihong Lu, Tan T Nguyen, Janice Chen, Yuan Chang Leong, Monica D Rosenberg, ShiNung Ching, Jeffrey M Zacks

Published: 05 Sept 2025 · Last Modified: 12 Apr 2026 · License: CC BY-SA 4.0
Abstract: Humans reflect on memories to make sense of ongoing events. Past work has shown that people retrieve causally related memories during comprehension, but the mechanisms underlying this process remain unclear. Here, we used a recurrent neural network augmented with a key-value episodic memory buffer and trained it to predict upcoming scenes of a television episode. At each time step, the model transformed the current scene into a value representing the memory's content and a key representing the memory's address, and stored both in the episodic memory buffer. The model selectively retrieved past values by applying self-attention over stored keys and integrated these memories with the current scene representation to generate predictions. The model retrieved memories similar to those retrieved by human participants who watched the same episode during fMRI. Importantly, this similarity disappeared when causal relationships between events were controlled for. The model also represented causally related events with similar patterns, paralleling how the human brain represents these events. These findings suggest that using two distinct memory representations allows the model to retrieve memories and organize events based on causal relationships, beyond semantic or perceptual similarities. Together, this work proposes a key-value episodic memory system as a candidate computational mechanism for how humans retrieve causally related memories to comprehend naturalistic events.
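The store-and-retrieve cycle described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the class name, dimensions, encoding step, and scaled dot-product similarity are all assumptions; the abstract specifies only that keys and values are stored per time step and that retrieval applies self-attention over stored keys to return past values.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()

class KeyValueEpisodicMemory:
    """Hypothetical sketch of a key-value episodic memory buffer.

    Each time step stores a (key, value) pair: the key serves as the
    memory's address, the value as its content. Retrieval computes
    attention weights between a query and all stored keys, then returns
    the attention-weighted sum of stored values.
    """

    def __init__(self):
        self.keys = []    # memory addresses
        self.values = []  # memory contents

    def store(self, key, value):
        self.keys.append(key)
        self.values.append(value)

    def retrieve(self, query):
        K = np.stack(self.keys)            # (n_memories, key_dim)
        V = np.stack(self.values)          # (n_memories, value_dim)
        # scaled dot-product attention over stored keys (an assumption;
        # the abstract says only "self-attention over stored keys")
        weights = softmax(K @ query / np.sqrt(K.shape[1]))
        return weights @ V                 # weighted sum of past values

# Toy usage: store five random (key, value) pairs, then retrieve.
rng = np.random.default_rng(0)
mem = KeyValueEpisodicMemory()
for _ in range(5):
    mem.store(rng.standard_normal(8), rng.standard_normal(16))
retrieved = mem.retrieve(rng.standard_normal(8))
print(retrieved.shape)  # (16,)
```

In a full model along these lines, the retrieved value would be concatenated with (or added to) the current scene representation before the recurrent network generates its next-scene prediction.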