Resurrecting Submodularity for Neural Text Generation

ICLR 2021 Conference Withdrawn Submission · 28 Sept 2020 (modified: 05 May 2023)
Keywords: submodularity, text generation, attention
Abstract: Submodularity is a desirable property for a variety of objectives in content selection, where the current neural encoder-decoder framework is inadequate. We define a class of novel attention mechanisms with submodular functions and, in turn, prove the submodularity of the effective neural coverage. The resulting attention module offers an architecturally simple and empirically effective method to improve the coverage of neural text generation. We run experiments on three directed text generation tasks that recover source content to different degrees, across two modalities, three neural model architectures, and two training-strategy variations. The results and analyses demonstrate that our method generalizes well across these settings, produces text of good quality, outperforms comparable baselines, and achieves state-of-the-art performance.
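To make the abstract's central notion concrete, here is the standard diminishing-returns definition of submodularity (textbook background, not text from the submission): a set function gains less from adding an element as the context set grows, which is the property the abstract claims for the effective neural coverage.

```latex
% Standard definition of submodularity (diminishing returns):
% a set function F : 2^V -> R over a ground set V is submodular iff
\[
  F(A \cup \{v\}) - F(A) \;\ge\; F(B \cup \{v\}) - F(B)
  \quad \text{for all } A \subseteq B \subseteq V,\ v \in V \setminus B.
\]
% Intuition for coverage: once a source token is already well
% attended (covered), attending to it again should add less.
```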
One-sentence Summary: A theoretically grounded and empirically effective attention mechanism exploiting submodularity.
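The abstract does not spell out the mechanism, so the following is a minimal NumPy sketch of one plausible "diminishing attention" in this spirit: raw attention weights are rescaled by the marginal gain of a concave function F of the cumulative attention mass, so source positions that are already covered receive diminishing effective weight. The function name `diminishing_attention`, the choice F(x) = log(1 + x), and the renormalization step are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    z = x - x.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def diminishing_attention(scores, coverage, F=np.log1p):
    """One decoding step of a diminishing (submodular-style) attention.

    scores   : raw alignment scores over source positions, shape (n,)
    coverage : cumulative attention mass per source position, shape (n,)
    F        : a concave, non-decreasing scalar function; the marginal
               gain F(c + a) - F(c) shrinks as coverage c grows, which
               is the diminishing-returns property.

    Illustrative sketch only; the submission may define the
    mechanism differently.
    """
    base = softmax(scores)                   # ordinary attention
    gain = F(coverage + base) - F(coverage)  # concave marginal gains
    attn = gain / gain.sum()                 # renormalize to a distribution
    return attn, coverage + base             # updated cumulative coverage

# Toy usage: attending twice with the same peaked scores shows the
# second step spreading mass away from already-covered positions.
cov = np.zeros(4)
for _ in range(2):
    attn, cov = diminishing_attention(np.array([3.0, 1.0, 0.5, 0.2]), cov)
    print(np.round(attn, 3))
```

Because F is concave and non-decreasing, the total effective coverage sum_i F(c_i) is a concave-of-modular function, a standard way to obtain submodular coverage objectives.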
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Reviewed Version (pdf): https://openreview.net/references/pdf?id=R8PjLlAW-p