Diverse Beam Search: Decoding Diverse Solutions from Neural Sequence Models

Submitted to ICLR 2017
Abstract: Neural sequence models are widely used to model time-series data. Equally ubiquitous is the usage of beam search (BS) as an approximate inference algorithm to decode output sequences from these models. BS explores the search space in a greedy left-to-right fashion, retaining only the top-B candidates. This tends to result in sequences that differ only slightly from each other. Producing lists of nearly identical sequences is not only computationally wasteful but also typically fails to capture the inherent ambiguity of complex AI tasks. To overcome this problem, we propose Diverse Beam Search (DBS), an alternative to BS that decodes a list of diverse outputs by optimizing a diversity-augmented objective. We observe that our method not only improves diversity but also finds better top-1 solutions by controlling the trade-off between exploration and exploitation of the search space. Moreover, these gains are achieved with minimal computational or memory overhead compared to beam search. To demonstrate the broad applicability of our method, we present results on image captioning, machine translation, conversation and visual question generation using both standard quantitative metrics and qualitative human studies. We find that our method consistently outperforms BS and previously proposed techniques for diverse decoding from neural sequence models.
TL;DR: We introduce a novel, diversity-promoting beam search algorithm that yields significantly improved diversity among decoded sequences, as evaluated on multiple sequence generation tasks.
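The abstract describes decoding with a diversity-augmented objective that trades off exploration and exploitation across beams. Below is a minimal illustrative sketch of that idea, not the authors' implementation: the toy `log_probs` model, the Hamming-style diversity penalty, and the parameter names (`beam_width`, `num_groups`, `diversity_strength`) are all assumptions chosen for illustration.

```python
# Minimal sketch of a diversity-augmented, group-wise beam search.
# Assumptions (not taken from the page): Hamming-style diversity penalty,
# a toy log-probability model, and illustrative parameter names.
import numpy as np


def log_probs(prefix, vocab_size):
    """Toy stand-in for a neural sequence model: a deterministic
    log-distribution over the next token given the prefix."""
    seed = hash(tuple(prefix)) % (2**32)
    logits = np.random.default_rng(seed).normal(size=vocab_size)
    return logits - np.log(np.sum(np.exp(logits)))  # log-softmax


def diverse_beam_search(vocab_size=20, beam_width=6, num_groups=3,
                        max_len=5, diversity_strength=0.5):
    group_size = beam_width // num_groups
    # Each group holds (sequence, cumulative log-prob) pairs.
    groups = [[([], 0.0)] for _ in range(num_groups)]
    for _ in range(max_len):
        chosen_this_step = []  # tokens picked by earlier groups at this step
        for g in range(num_groups):
            candidates = []
            for seq, score in groups[g]:
                lp = log_probs(seq, vocab_size)
                # Diversity term: penalise tokens already chosen by
                # previous groups at the current time step.
                penalty = np.zeros(vocab_size)
                for tok in chosen_this_step:
                    penalty[tok] += 1.0
                for tok in range(vocab_size):
                    true_score = score + lp[tok]
                    select_score = true_score - diversity_strength * penalty[tok]
                    candidates.append((seq + [tok], true_score, select_score))
            # Standard beam step within the group, ranked by the
            # diversity-augmented objective.
            candidates.sort(key=lambda c: c[2], reverse=True)
            groups[g] = [(s, ts) for s, ts, _ in candidates[:group_size]]
            chosen_this_step.extend(s[-1] for s, _ in groups[g])
    return groups


if __name__ == "__main__":
    for g, beams in enumerate(diverse_beam_search()):
        print(f"group {g}:", [seq for seq, _ in beams])
```

In this sketch the first group runs standard beam search, while later groups are penalized for repeating tokens already selected by earlier groups at the same step, which is one way to realize the exploration/exploitation control the abstract refers to.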
Conflicts: vt.edu, indiana.edu
Keywords: Deep learning, Computer vision, Natural language processing
Community Implementations: [10 code implementations](https://www.catalyzex.com/paper/arxiv:1610.02424/code)
