A Well-Composed Text is Half Done! Semantic Composition Sampling for Diverse Conditional Generation

Anonymous

16 Jan 2022 (modified: 05 May 2023), ACL ARR 2022 January Blind Submission. Readers: Everyone
Abstract: We propose Composition Sampling, a simple but effective method to generate higher-quality diverse outputs for conditional generation tasks than previous stochastic decoding strategies. It builds on recently proposed planning-based neural generation models that are trained to first create a composition of the output in the form of an entity chain, and then generate conditioned on the entity chain and the input (Narayan et al., 2021). Our approach avoids text degeneration by first sampling a composition in the form of an entity chain and then using beam search to generate the best possible text grounded in the entity chain. Experiments on summarization (CNN/DailyMail and XSum) and SQuAD question generation, using a wide variety of automatic metrics and human evaluation, demonstrate that Composition Sampling is currently the best available decoding strategy for generating diverse, meaningful outputs. We further introduce a novel automatic measure for jointly evaluating diversity and faithfulness in summaries.
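As a rough illustration of the two-stage decode the abstract describes, here is a minimal sketch assuming a FROST-style planning model (Narayan et al., 2021) fine-tuned to emit "<entity chain> [SUMMARY] <text>". The checkpoint name, the "[SUMMARY]" separator marker, and the Hugging Face generation calls are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of Composition Sampling under the assumptions stated above:
# a hypothetical seq2seq checkpoint trained to generate the entity-chain plan
# followed by a "[SUMMARY]" marker and the text itself.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL = "your-frost-style-checkpoint"  # hypothetical planning model
SEP = "[SUMMARY]"                      # hypothetical plan/text separator

tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL)

def composition_sample(document: str, top_p: float = 0.95, num_beams: int = 4) -> str:
    enc = tokenizer(document, return_tensors="pt", truncation=True)

    # Stage 1: stochastically sample a composition (entity chain) with nucleus
    # sampling; all diversity in the final output comes from this step.
    draft = model.generate(**enc, do_sample=True, top_p=top_p, max_new_tokens=48)
    plan = tokenizer.decode(draft[0], skip_special_tokens=True).split(SEP)[0].strip()

    # Stage 2: fix the sampled entity chain as the decoder prefix, then continue
    # with beam search so the text stays fluent and grounded in the plan.
    prefix = tokenizer(f"{plan} {SEP}", add_special_tokens=False,
                       return_tensors="pt").input_ids
    start = torch.tensor([[model.config.decoder_start_token_id]])
    out = model.generate(**enc,
                         decoder_input_ids=torch.cat([start, prefix], dim=-1),
                         num_beams=num_beams, max_new_tokens=128)
    return tokenizer.decode(out[0], skip_special_tokens=True)
```

Sampling several plans and decoding each yields the diverse output set; note that given a fixed plan, the beam-search stage is deterministic.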
Paper Type: long