Two Heads Are Better Than One: Exploiting Both Sequence and Graph Models in AMR-To-Text Generation

Anonymous

16 Feb 2024 · ACL ARR 2024 February Blind Submission · Readers: Everyone
Abstract: Abstract meaning representation (AMR) is a semantic representation language that captures the meaning of sentences with syntax-irrelevant graphs. AMR-to-text generation aims to generate text from a given AMR graph and is helpful in various downstream NLP tasks. Existing AMR-to-text generation methods roughly fall into two categories, each with pros and cons. Sequence-to-sequence models, especially pretrained language models (PLMs), have strong text generation ability but cannot cope well with the structural information of AMR graphs. Graph-to-sequence models use graph neural networks (GNNs) to encode the graph structure directly, but fall short of PLMs in text generation quality. Combining the two approaches could harness their complementary strengths, yet merging a GNN with a PLM is non-trivial. In this paper, we propose DualGen, a dual encoder-decoder model that integrates a specially designed GNN into a sequence-to-sequence PLM. Extensive experiments, a human evaluation, and a case study show that DualGen achieves the desired effect and yields state-of-the-art performance on the AMR-to-text generation task. We also show that it outperforms the most powerful general-purpose PLMs, LLaMA and GPT-4.
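
To make the dual-encoder idea concrete, here is a minimal, illustrative PyTorch sketch (not the authors' released code): a simple message-passing GNN encodes the AMR graph nodes, a Transformer encoder stands in for the PLM's sequence encoder over the linearized AMR, and a single decoder attends to the concatenation of both encoders' outputs. All module names, layer counts, and dimensions are assumptions made for illustration only.

```python
# Hypothetical sketch of a dual-encoder seq2seq model in the spirit of DualGen.
# Every class name, size, and the simple adjacency-based GNN are assumptions.
import torch
import torch.nn as nn


class SimpleGNNEncoder(nn.Module):
    """Encodes AMR node embeddings with adjacency-based message passing."""

    def __init__(self, d_model: int, num_layers: int = 2):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Linear(d_model, d_model) for _ in range(num_layers)]
        )

    def forward(self, node_emb: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # node_emb: (batch, nodes, d_model); adj: (batch, nodes, nodes)
        h = node_emb
        for layer in self.layers:
            # Aggregate neighbor states via the adjacency matrix, add a residual.
            h = torch.relu(layer(adj @ h)) + h
        return h


class DualEncoderSeq2Seq(nn.Module):
    """Sequence encoder (stand-in for a PLM encoder) + GNN encoder -> one decoder."""

    def __init__(self, vocab_size: int, d_model: int = 256, nhead: int = 8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.seq_encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        self.graph_encoder = SimpleGNNEncoder(d_model)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers=2)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, src_tokens, node_tokens, adj, tgt_tokens):
        seq_mem = self.seq_encoder(self.embed(src_tokens))        # linearized AMR
        graph_mem = self.graph_encoder(self.embed(node_tokens), adj)  # AMR graph
        memory = torch.cat([seq_mem, graph_mem], dim=1)  # decoder attends to both
        out = self.decoder(self.embed(tgt_tokens), memory)
        return self.lm_head(out)


# Toy usage with random inputs.
model = DualEncoderSeq2Seq(vocab_size=1000)
src = torch.randint(0, 1000, (2, 12))         # linearized AMR tokens
nodes = torch.randint(0, 1000, (2, 6))        # AMR concept nodes
adj = torch.randint(0, 2, (2, 6, 6)).float()  # graph adjacency
tgt = torch.randint(0, 1000, (2, 10))         # target sentence tokens
logits = model(src, nodes, adj, tgt)          # shape: (2, 10, 1000)
```

Concatenating the two encoder outputs lets the decoder's cross-attention draw on graph-structural and sequence-level representations at once, which is the intuition behind combining the two model families described in the abstract.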
Paper Type: long
Research Area: Generation
Contribution Types: Model analysis & interpretability, NLP engineering experiment, Publicly available software and/or pre-trained models
Languages Studied: English