CoT-Planner: Chain-of-Thoughts as the Content Planner for Few-shot Table-to-Text Generation Reduces the Hallucinations from LLMs
Abstract: Few-shot table-to-text generation seeks to generate natural language descriptions for a given table in low-resource scenarios. Previous works have mostly utilized Pre-trained Language Models (PLMs) and even Large Language Models (LLMs) to generate fluent descriptions of tables. However, these models are prone to hallucinations that do not conform to the table. In this work, we propose CoT-Planner, a simple but effective Chain-of-Thoughts-based approach that reduces hallucinations in few-shot table-to-text generation. We first use an LLM (such as ChatGPT) to automatically generate ten intermediate content plans in the form of a Chain-of-Thoughts (CoT) for each table-description pair. Then, we refine the most accurate content plan for each sample and use the table-text pairs, augmented with the refined content plan (CoT-Plan), as demonstrations for In-Context Learning (ICL). Both automatic and human evaluations on the numericNLG dataset show that our method effectively alleviates hallucinations, thereby improving factual consistency in few-shot table-to-text generation. The code and data can be accessed at https://github.com/FXLP/CoT-Planner.
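To make the two-stage pipeline in the abstract concrete, here is a minimal sketch in Python: generating candidate CoT content plans for each table-description pair, then assembling the refined (table, CoT-Plan, text) triples as in-context demonstrations. The `call_llm` wrapper, the prompt wording, and the field names are illustrative assumptions, not the authors' exact implementation (see the repository above for that).

```python
# Minimal sketch of the CoT-Planner prompting pipeline.
# `call_llm` is a hypothetical stand-in for any chat-completion
# client (e.g. ChatGPT); plug in your own LLM call.

def call_llm(prompt: str) -> str:
    """Hypothetical wrapper around an LLM chat-completion endpoint."""
    raise NotImplementedError("plug in your LLM client here")

def generate_candidate_plans(table: str, description: str, n: int = 10) -> list[str]:
    """Step 1: ask the LLM for n intermediate content plans in CoT form."""
    prompt = (
        "Given the table and its description, write a step-by-step "
        "content plan (chain of thoughts) linking table cells to the "
        f"description.\nTable:\n{table}\nDescription:\n{description}\nPlan:"
    )
    return [call_llm(prompt) for _ in range(n)]

def build_icl_prompt(demos: list[dict], test_table: str) -> str:
    """Step 2: prepend (table, CoT-Plan, text) demonstrations for ICL.

    Each demo dict is assumed to carry the manually refined plan under
    the (hypothetical) keys 'table', 'cot_plan', and 'text'.
    """
    parts = []
    for d in demos:
        parts.append(
            f"Table:\n{d['table']}\nContent plan:\n{d['cot_plan']}\n"
            f"Description:\n{d['text']}\n"
        )
    parts.append(f"Table:\n{test_table}\nContent plan:")
    return "\n".join(parts)
```

In this reading, the model is prompted to first emit a content plan for the test table and then a description conditioned on it, which is what grounds the generated text in the table cells and reduces hallucination.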
External IDs: dblp:conf/ijcnn/LinBYXLLR25