CoT-Planner: Chain-of-Thoughts as the Content Planner for Few-shot Table-to-Text Generation Reduces Hallucinations from LLMs

ACL ARR 2024 June Submission 287 Authors

09 Jun 2024 (modified: 02 Jul 2024), ACL ARR 2024 June Submission, CC BY 4.0
Abstract: Few-shot table-to-text generation seeks to produce natural language descriptions of a given table in low-resource scenarios. Previous works mostly utilize Pre-trained Language Models (PLMs) and, more recently, Large Language Models (LLMs) to generate fluent descriptions of tables. However, these models are prone to hallucinations: generated content that does not conform to the table. In this work, we propose CoT-Planner, a simple but effective Chain-of-Thoughts-based approach for reducing hallucinations in few-shot table-to-text generation. We first use a large language model (such as ChatGPT) to automatically generate ten intermediate content plans in the form of a Chain-of-Thoughts (CoT) for each table-description pair. We then refine the most accurate content plan for each sample and use the table-text pairs, augmented with the refined content plan (CoT-Plan), as demonstrations for In-Context Learning (ICL). Both automatic and human evaluations on the numericNLG dataset show that our method effectively alleviates hallucinations, thereby improving factual consistency in few-shot table-to-text generation. The code and data will be released upon acceptance.
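The paper's exact prompt format and implementation are not shown on this page; the following is a minimal sketch, assuming a linearized table representation and the OpenAI chat API, of how CoT-Plan demonstrations could be assembled into an in-context learning prompt. The demonstration contents, the build_prompt helper, and the model name are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of CoT-Plan-style in-context learning (hypothetical;
# the paper's actual prompt format, model, and plan-refinement code are
# not shown on this page).
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Each demonstration pairs a linearized table with a refined content
# plan (CoT-Plan) and the reference description. Contents are
# illustrative placeholders, not data from the paper.
demonstrations = [
    {
        "table": "Model | BLEU | ROUGE-L\nOurs | 12.4 | 30.1\nBaseline | 10.2 | 27.5",
        "cot_plan": ("Step 1: Compare BLEU of Ours (12.4) and Baseline (10.2). "
                     "Step 2: Compare ROUGE-L (30.1 vs. 27.5). "
                     "Step 3: Conclude Ours outperforms Baseline on both metrics."),
        "description": ("Our model achieves 12.4 BLEU and 30.1 ROUGE-L, "
                        "outperforming the baseline on both metrics."),
    },
    # ... more (table, CoT-Plan, description) demonstrations ...
]

def build_prompt(demos, test_table):
    """Concatenate demonstrations, each with its CoT-Plan, then the test table."""
    parts = []
    for d in demos:
        parts.append(f"Table:\n{d['table']}\n"
                     f"Content plan:\n{d['cot_plan']}\n"
                     f"Description:\n{d['description']}\n")
    parts.append(f"Table:\n{test_table}\n"
                 "Content plan:")  # the model first plans, then describes
    return "\n".join(parts)

test_table = "Model | Accuracy\nOurs | 91.2\nBaseline | 88.7"
response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # assumption; the abstract only says "such as ChatGPT"
    messages=[
        {"role": "system",
         "content": "Generate a content plan, then a description faithful to the table."},
        {"role": "user", "content": build_prompt(demonstrations, test_table)},
    ],
)
print(response.choices[0].message.content)
```

The intuition, per the abstract, is that conditioning each demonstration on an explicit content plan steers the model toward describing only facts present in the table, which is where the reported reduction in hallucinations comes from.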
Paper Type: Long
Research Area: Generation
Research Area Keywords: data-to-text generation; few-shot generation
Contribution Types: Approaches to low-resource settings
Languages Studied: English
Submission Number: 287