A Well-Composed Text is Half Done! Semantic Composition Sampling for Diverse Conditional Generation

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: We propose Composition Sampling, a simple but effective method for generating higher-quality diverse outputs for conditional generation tasks than previous stochastic decoding strategies. It builds on recently proposed planning-based neural generation models that are trained to first create a composition of the output, in the form of an entity chain, and then continue generating conditioned on the entity chain and the input \cite{frost}. Our approach avoids text degeneration by first sampling a composition as an entity chain and then using beam search to generate the best possible text grounded in that entity chain. Experiments on CNN/DailyMail and XSum, using a variety of automatic metrics and human evaluation, demonstrate that Composition Sampling is currently the best available decoding strategy for generating diverse, meaningful summaries. It further outperforms state-of-the-art approaches to question generation in terms of BLEU.
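The two-stage decoding procedure the abstract describes can be illustrated with a short sketch. Below is a minimal, hypothetical implementation using Hugging Face transformers; the checkpoint name and the "[SUMMARY]" plan/summary separator token are placeholder assumptions (a FROST-style model is presumed to decode the entity chain before the summary text), not artifacts released with this paper. The entity-chain plan is drawn with nucleus sampling, and the summary is then completed with beam search conditioned on the input and the sampled plan.

```python
# Minimal sketch of Composition Sampling, assuming a FROST-style seq2seq
# checkpoint whose decoder emits "<plan tokens> [SUMMARY] <summary tokens>".
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MODEL_NAME = "your-org/frost-style-summarizer"  # hypothetical checkpoint name
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSeq2SeqLM.from_pretrained(MODEL_NAME)

def composition_sample(document: str, num_outputs: int = 3) -> list[str]:
    inputs = tokenizer(document, return_tensors="pt", truncation=True)
    # Assumption: "[SUMMARY]" is in the vocabulary and separates plan from text.
    sep_id = tokenizer.convert_tokens_to_ids("[SUMMARY]")
    summaries = []
    for _ in range(num_outputs):
        # Step 1: stochastically sample a composition (entity-chain plan) with
        # nucleus sampling; stopping at the separator yields the plan only.
        plan_ids = model.generate(
            **inputs,
            do_sample=True,
            top_p=0.95,
            max_new_tokens=64,
            eos_token_id=sep_id,
        )
        # Step 2: beam-search the summary conditioned on the input and the
        # sampled plan, by seeding the decoder with the plan as a prefix.
        out = model.generate(
            **inputs,
            decoder_input_ids=plan_ids,
            num_beams=4,
            max_new_tokens=128,
        )
        # Decode only the continuation (the summary after the plan prefix).
        summary = tokenizer.decode(
            out[0][plan_ids.shape[1]:], skip_special_tokens=True
        )
        summaries.append(summary)
    return summaries
```

Each call to Step 1 can yield a different entity chain, so diversity comes from the sampled plan while the surface text itself stays grounded via beam search, which is the intuition the abstract gives for avoiding text degeneration.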