KG + Narrative > LLM: Integrating a Commonsense Knowledge Graph with Children's Storybook Narratives

Anonymous

16 Dec 2023 | ACL ARR 2023 December Blind Submission | Readers: Everyone
Abstract: Structured knowledge such as Knowledge Graphs (KGs) has long been used by humans in real-world scenarios (e.g., clinical diagnosis and children's education) alongside free-form narratives. Despite their exceptional text generation ability, whether Large Language Models (LLMs) adapt to and perform well on these specialized real-world tasks has been overlooked. In the LLM era, is structured knowledge still useful for domain-specific tasks? In this paper, we propose a new interactive storytelling task grounded in real-world needs: during storytelling, preschool teachers and parents educate children about real-world knowledge through question answering (QA) that goes beyond the story narrative. For this task, we 1) design an annotation framework that leverages an established commonsense KG to enrich narrative QA, and 2) construct an expert-annotated FairytaleCQA dataset (5,868 QA pairs) with external commonsense knowledge for evaluation. Our experiments show that: 1) expert-annotated structured knowledge can enhance the performance of LLMs (e.g., GPT-4); 2) our designed question-answer generation (QAG) pipeline enables a small fine-tuned LM to consistently outperform large LLMs on FairytaleCQA.
Paper Type: long
Research Area: Resources and Evaluation
Contribution Types: NLP engineering experiment, Data resources
Languages Studied: English