Annotating Hallucinations in Data-to-Text NLG: Implementing a Logical Framework in Different Domains

ACL ARR 2025 May Submission 3177 Authors

19 May 2025 (modified: 03 Jul 2025)
Abstract: Hallucinations are a persistent challenge in natural language generation, including data-to-text. van Deemter (2024) introduced a unifying framework based on logical consequence, aiming to categorize all hallucinations through a single formal relation. We examine whether human annotators and large language models are able to apply the framework in two data-to-text domains. Results suggest that the framework is applicable, but they also reveal significant domain-dependent variation and discrepancies between human and model judgments. We also uncover several challenges that inform future work on hallucination annotation.
Paper Type: Long
Research Area: Generation
Research Area Keywords: data-to-text generation, domain adaptation, human evaluation, automatic evaluation
Languages Studied: English
Submission Number: 3177