SSP: Story-Space Prompting Improves Reader Immersion in Long Story Generation

ACL ARR 2024 June Submission 4670 Authors

16 Jun 2024 (modified: 21 Jul 2024) · ACL ARR 2024 June Submission · License: CC BY 4.0
Abstract: Long-form stories generated with neural network models, even large language models (LLMs) such as GPT, have long been criticized for lacking interestingness and coherence, which greatly diminishes the reader's sense of immersion. In this paper, we present a novel "story space" prompting (SSP) approach, which provides a coherent and consistent background to support long-term storytelling. Specifically, we first define the story space as being intricately connected to the given story premise. Our framework then systematically generates the story space by progressively refining it from an abstract representation into a more informative and detailed one. Empirically, we implement our plug-in method on top of an existing advanced story generation framework (Yang et al., 2023) and evaluate its impact on both interestingness and coherence. Our findings highlight the significance of SSP in enhancing reader enjoyment and immersion, contributing to advancements in long-form story generation.
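
The abstract describes a progressive, prompt-driven construction of a story space that later conditions passage generation. The sketch below is a hypothetical illustration of that idea, not the authors' released code: the prompts, the `call_llm` stub, and the function names (`build_story_space`, `generate_passage`) are all assumptions made for clarity.

```python
def call_llm(prompt: str) -> str:
    """Placeholder for an LLM call; connect this stub to a real model or API."""
    raise NotImplementedError("Plug in an LLM backend here.")


def build_story_space(premise: str, n_refinements: int = 2) -> str:
    """Progressively expand an abstract story space tied to the premise
    into a more informative, detailed background description."""
    # Step 1: abstract story space derived from the premise.
    space = call_llm(
        "Given the story premise below, list the core setting, main characters, "
        "and key background facts in brief, abstract form.\n\n"
        f"Premise: {premise}"
    )
    # Step 2: progressive refinement from abstract to detailed.
    for _ in range(n_refinements):
        space = call_llm(
            "Expand the following story space with richer, mutually consistent "
            "details (locations, character motivations, world rules), staying "
            "consistent with the premise.\n\n"
            f"Premise: {premise}\n\nCurrent story space:\n{space}"
        )
    return space


def generate_passage(premise: str, story_space: str, outline_item: str) -> str:
    """Condition each generated passage on the shared story space so that
    long-range background details stay coherent across the story."""
    return call_llm(
        f"Background (story space):\n{story_space}\n\n"
        f"Premise: {premise}\n"
        f"Write the next story passage covering: {outline_item}"
    )
```

Used as a plug-in, a pipeline of this shape would call `build_story_space` once per premise and pass the result as background context to every passage-generation step of the host framework.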
Paper Type: Long
Research Area: Generation
Research Area Keywords: text-to-text generation
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 4670