Abstract: While dialogue systems built on large language models (LLMs) have demonstrated basic memory storage capabilities, they still face critical challenges in dynamically updating that memory. Existing systems often rely on simple accumulation, which leaves contradictory information in the memory bank and severely degrades dialogue consistency. To address these issues, we propose the Memory Management Module (MMM) framework, an external memory management system decoupled from the LLM that enables precise memory maintenance. Unlike traditional parameter-update methods, MMM introduces a rule-engine-based memory operation protocol that supports dynamic creation, deletion, and modification of memory entries. We also train a lightweight memory update model that reduces computational cost without sacrificing performance. To validate the system's effectiveness, we construct the Mem\_Dialogue-100 dataset: a multi-turn dialogue dataset with explicit state transition markers, in which each dialogue is seeded with multiple memory conflict events to simulate real interaction scenarios. Experiments show that MMM improves dialogue consistency by 17.0\% on Qwen-7B and 12.1\% on GPT-4o, while reducing time complexity relative to traditional methods. These findings offer a new technical pathway for building dialogue systems with continuous learning capability.
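The abstract describes a memory operation protocol supporting creation, deletion, and modification of entries in an external memory bank. Below is a minimal sketch of what such a protocol could look like; the class names (`MemoryBank`, `MemoryOp`, `MemoryEntry`) and the key scheme (`user.favorite_food`) are hypothetical illustrations, not the paper's actual implementation or rule engine.

```python
from dataclasses import dataclass, field
from enum import Enum
import time


class MemoryOp(Enum):
    """Operations a memory update protocol might expose."""
    CREATE = "create"
    DELETE = "delete"
    UPDATE = "update"


@dataclass
class MemoryEntry:
    key: str                 # e.g. "user.favorite_food" (hypothetical key scheme)
    value: str
    timestamp: float = field(default_factory=time.time)


class MemoryBank:
    """External memory store decoupled from the LLM, keyed by attribute."""

    def __init__(self) -> None:
        self._entries: dict[str, MemoryEntry] = {}

    def apply(self, op: MemoryOp, key: str, value: str = "") -> None:
        if op in (MemoryOp.CREATE, MemoryOp.UPDATE):
            # UPDATE overwrites the stale value in place, avoiding the
            # contradictory side-by-side entries that simple accumulation keeps.
            self._entries[key] = MemoryEntry(key, value)
        elif op is MemoryOp.DELETE:
            self._entries.pop(key, None)

    def lookup(self, key: str) -> str | None:
        entry = self._entries.get(key)
        return entry.value if entry else None


# Usage: a memory conflict event ("I used to love sushi, but now I'm vegetarian")
bank = MemoryBank()
bank.apply(MemoryOp.CREATE, "user.favorite_food", "sushi")
bank.apply(MemoryOp.UPDATE, "user.favorite_food", "vegetarian dishes")
print(bank.lookup("user.favorite_food"))  # -> "vegetarian dishes"
```

Keeping one entry per key is what makes update and delete constant-time dictionary operations, which is consistent with the abstract's claim of reduced time complexity relative to scan-and-accumulate approaches.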
Paper Type: Long
Research Area: Dialogue and Interactive Systems
Research Area Keywords: LLM, Dialogue system, LLM Memory
Contribution Types: NLP engineering experiment
Languages Studied: Python
Submission Number: 8419