Mobius-Cycle: Multi-round Reverse Reconstruction of Graphs Boosts the Consistency of Graph-to-Text Generation Using LLMs
Keywords: Graph-to-Text Generation, Reverse Reconstruction, Consistency
Abstract: Graph-to-text generation (G2T) is a branch of Natural Language Generation (NLG) that aims to produce text that accurately and fluently describes a given graph. Recently, large language models (LLMs) have demonstrated remarkable in-context learning (ICL) capabilities on various NLG tasks. However, LLMs are prone to hallucination, producing text that is inconsistent with the original graph. In this paper, we are the first to comprehensively explore the performance of various LLMs on G2T tasks over both knowledge graph (KG) and Abstract Meaning Representation (AMR) data. For high-consistency G2T, we propose the LLM-based Mobius-Cycle (MoC) framework, a training-free inference-time scaling method that uses multi-round reverse reconstruction of graphs to enhance consistency. In each round, the LLM performs three steps in sequence: reverse reconstruction, consistency judgement, and graph-to-text generation. We define a new metric, RRGScore, for LLM-as-a-Judge evaluation of the consistency between the reverse-reconstructed graph (RRG) and the original graph, and a second metric, the Judgement Confidence Index (JCI), to measure the confidence of the LLM during the consistency judgement process. We conduct extensive automatic and LLM-as-a-Judge evaluations on both KG and AMR datasets to assess the consistency of various LLMs. The experimental results indicate that our MoC framework effectively improves the consistency of LLMs on G2T tasks and offers good interpretability through its consistency judgements. The code and data will be released upon acceptance.
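The abstract describes the MoC loop only at a high level. The sketch below is one possible reading of it, not the paper's implementation: the `generate`, `reconstruct`, and `judge` callables, the round budget, and the score threshold are all hypothetical placeholders for details the abstract does not specify.

```python
from typing import Callable

def mobius_cycle(
    graph: str,
    generate: Callable[[str], str],      # graph -> text (graph-to-text generation)
    reconstruct: Callable[[str], str],   # text -> reverse-reconstructed graph (RRG)
    judge: Callable[[str, str], float],  # (graph, RRG) -> consistency score, e.g. RRGScore
    max_rounds: int = 3,                 # hypothetical round budget
    threshold: float = 0.9,              # hypothetical acceptance threshold
) -> str:
    """Regenerate the text until the reverse-reconstructed graph
    is judged consistent with the original graph."""
    text = generate(graph)               # initial graph-to-text generation
    for _ in range(max_rounds):
        rrg = reconstruct(text)          # step 1: reverse reconstruction
        score = judge(graph, rrg)        # step 2: consistency judgement
        if score >= threshold:           # consistent enough: stop early
            break
        text = generate(graph)           # step 3: graph-to-text generation (retry)
    return text
```

In this reading, the judge's score acts as the stopping criterion, which matches the abstract's claim that the framework's interpretability rests on the consistency judgement step.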
Paper Type: Long
Research Area: Natural Language Generation
Research Area Keywords: Generation, Data-to-text generation, Reverse Reconstruction
Contribution Types: Model analysis & interpretability, NLP engineering experiment
Languages Studied: English
Submission Number: 1582