Generating Long-form Story Using Dynamic Hierarchical Outlining with Memory-Enhancement

ACL ARR 2024 June Submission92 Authors

05 Jun 2024 (modified: 02 Jul 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: The long-form story generation task aims to produce coherent and sufficiently lengthy text, which is essential for applications such as novel writing and interactive storytelling. However, existing methods, including LLMs, rely on rigid outlines or lack macro-level planning, making it difficult to achieve both contextual consistency and coherent plot development in long-form story generation. To address these issues, we propose a long-form story generation method named DOME (Dynamic Hierarchical Outlining with Memory-Enhancement) that generates long stories with coherent content and plot. Specifically, the Dynamic Hierarchical Outline (DHO) mechanism incorporates novel-writing theory into outline planning and fuses the planning and writing stages, improving plot coherence by ensuring plot completeness and the fluency of story development. Additionally, a Memory-Enhancement Module (MEM) based on temporal knowledge graphs is introduced to store and access generated content, reducing contextual conflicts and improving story coherence. Finally, we propose a Temporal Conflict Analyzer that leverages temporal knowledge graphs to evaluate contextual consistency. Experiments demonstrate that DOME significantly improves the fluency, coherence, and overall quality of generated long stories compared to state-of-the-art methods.
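To illustrate the idea of a temporal-knowledge-graph memory as described in the abstract, the sketch below shows one minimal way such a module could store and query facts extracted from generated chapters. All names (`TemporalKGMemory`, `add_fact`, `query`, the `chapter` timestamp) are assumptions for illustration only and are not taken from the paper's implementation.

```python
# Hypothetical sketch of a temporal-knowledge-graph memory for generated story
# content. The paper's actual MEM may be structured differently.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class Fact:
    subject: str
    relation: str
    obj: str
    chapter: int  # timestamp: index of the chapter where the fact appeared


@dataclass
class TemporalKGMemory:
    facts: list[Fact] = field(default_factory=list)

    def add_fact(self, subject: str, relation: str, obj: str, chapter: int) -> None:
        """Store a fact extracted from newly generated text."""
        self.facts.append(Fact(subject, relation, obj, chapter))

    def query(self, subject: str, relation: str, up_to_chapter: int) -> list[Fact]:
        """Return the history of a (subject, relation) pair up to a chapter,
        e.g. to check for contextual conflicts before writing the next passage."""
        return [
            f for f in self.facts
            if f.subject == subject and f.relation == relation and f.chapter <= up_to_chapter
        ]


# Usage: flag a potential temporal conflict when the same (subject, relation)
# maps to different objects across chapters.
memory = TemporalKGMemory()
memory.add_fact("Alice", "location", "harbor", chapter=3)
memory.add_fact("Alice", "location", "forest", chapter=5)
history = memory.query("Alice", "location", up_to_chapter=5)
print("potential conflict:", len({f.obj for f in history}) > 1)
```

A conflict check like this is only a rough stand-in for the paper's Temporal Conflict Analyzer; the key design point it illustrates is that timestamping facts by chapter lets consistency be evaluated over the story's timeline rather than against the most recent context window alone.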
Paper Type: Long
Research Area: NLP Applications
Research Area Keywords: Large Language Models, Long-form Story Generation
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 92