Two Heads Are Better Than One: Exploiting Both Sequence and Graph Models in AMR-To-Text Generation

23 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: generative models
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: graph-to-text generation, abstract meaning representation, dual-encoder
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: A dual encoder-decoder model for AMR graph-to-text generation, featuring a graph neural network encoder along with pretrained language models.
Abstract: Abstract meaning representation (AMR) is a semantic representation language that captures the core meaning of a sentence in a syntax-independent graph. AMR-to-text generation, which aims to generate a sentence from a given AMR graph, is a well-studied task and has proven helpful in various other NLP tasks. Existing AMR-to-text generation methods fall roughly into two categories, each with its own advantages and disadvantages. The first adopts a sequence-to-sequence model, typically a pretrained language model (PLM); it generates fluent text but cannot handle the structural information of AMR graphs well. The second is based on graph neural networks (GNNs), whose strengths and weaknesses are exactly the opposite. To combine the strengths of both kinds of models, we propose a dual encoder-decoder model named \modelName, which integrates a specially designed GNN into a pretrained sequence-to-sequence model. Extensive experiments, together with a human evaluation and a case study, show that it achieves the desired effect and yields state-of-the-art performance on the AMR-to-text generation task. We also demonstrate that it outperforms the powerful general-purpose PLM GPT-4.
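The abstract describes the dual-encoder idea only at a high level; the sketch below is a minimal, generic illustration of that idea, not the authors' \modelName architecture. It assumes a hypothetical setup in which a sequence encoder (standing in for a PLM) encodes a linearized AMR, a small message-passing graph encoder encodes the AMR node/edge structure, and a single decoder cross-attends to the concatenation of both memories. All module names, sizes, and the fusion-by-concatenation choice are illustrative assumptions.

```python
import torch
import torch.nn as nn


class SimpleGraphEncoder(nn.Module):
    """Minimal mean-aggregation message passing over AMR node embeddings (illustrative only)."""

    def __init__(self, d_model, num_layers=2):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Linear(2 * d_model, d_model) for _ in range(num_layers)]
        )

    def forward(self, node_states, adj):
        # node_states: (batch, nodes, d_model); adj: (batch, nodes, nodes) 0/1 adjacency
        h = node_states
        deg = adj.sum(-1, keepdim=True).clamp(min=1)
        for layer in self.layers:
            neighbors = adj @ h / deg                      # mean over adjacent nodes
            h = torch.relu(layer(torch.cat([h, neighbors], dim=-1)))
        return h


class DualEncoderDecoder(nn.Module):
    """Toy dual-encoder seq2seq: the decoder attends to both sequence and graph memories."""

    def __init__(self, vocab_size, d_model=256, nhead=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.seq_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True), num_layers=2
        )
        self.graph_encoder = SimpleGraphEncoder(d_model)
        self.decoder = nn.TransformerDecoder(
            nn.TransformerDecoderLayer(d_model, nhead, batch_first=True), num_layers=2
        )
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_tokens, node_tokens, adj, tgt_tokens):
        seq_mem = self.seq_encoder(self.embed(src_tokens))             # linearized AMR
        graph_mem = self.graph_encoder(self.embed(node_tokens), adj)   # AMR graph structure
        memory = torch.cat([seq_mem, graph_mem], dim=1)                # fuse both encoders
        dec = self.decoder(self.embed(tgt_tokens), memory)
        return self.out(dec)


# Smoke test with random data (shapes only; no real AMR preprocessing).
model = DualEncoderDecoder(vocab_size=1000)
src = torch.randint(0, 1000, (2, 12))        # linearized AMR token ids
nodes = torch.randint(0, 1000, (2, 6))       # AMR concept node ids
adj = torch.randint(0, 2, (2, 6, 6)).float() # node adjacency
tgt = torch.randint(0, 1000, (2, 10))        # target sentence token ids
print(model(src, nodes, adj, tgt).shape)     # torch.Size([2, 10, 1000])
```

In an actual system one would replace the toy sequence encoder and decoder with a pretrained seq2seq model and choose a more careful fusion mechanism; the point here is only that a single decoder can consume the outputs of a sequence encoder and a graph encoder jointly.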
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
Supplementary Material: zip
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 7480