A Generalized Framework of Sequence Generation with Application to Undirected Sequence Models

Sep 25, 2019 · Blind Submission
  • Keywords: nlp, sequence modeling, natural language generation, machine translation, BERT, Sesame Street
  • TL;DR: We unify several language generation paradigms (monotonic autoregressive, non-autoregressive, etc.) in a single framework, and use the framework to do machine translation with undirected sequence models.
  • Abstract: Undirected neural sequence models such as BERT (Devlin et al., 2019) have received renewed interest due to their success on discriminative natural language understanding tasks such as question-answering and natural language inference. The problem of generating sequences directly from these models has received relatively little attention, in part because generating from such models departs significantly from the conventional approach of monotonic generation in directed sequence models. We investigate this problem by first proposing a generalized model of sequence generation that unifies decoding in directed and undirected models. The proposed framework models the process of generation rather than a resulting sequence, and under this framework, we derive various neural sequence models as special cases, such as autoregressive, semi-autoregressive, and refinement-based non-autoregressive models. This unification enables us to adapt decoding algorithms originally developed for directed sequence models to undirected models. We demonstrate this by evaluating various decoding strategies for a cross-lingual masked translation model (Lample and Conneau, 2019). Our experiments show that generation from undirected sequence models, under our framework, is competitive with the state of the art on WMT'14 English-German translation. We also demonstrate that the proposed approach enables constant-time translation with similar performance to linear-time translation from the same model by rescoring hypotheses with an autoregressive model.
  • Original Pdf:  pdf
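As a reading aid (not the authors' code): the abstract's "model of the process of generation" can be caricatured as a loop that alternates between choosing which positions to update and filling them in, where the choice policy determines the paradigm. The toy `fill` function below is a deterministic stand-in for a real masked language model such as the cross-lingual model of Lample and Conneau (2019); all names here are illustrative assumptions, not the paper's API.

```python
MASK = "<mask>"

def fill(seq, positions):
    """Toy stand-in for a masked LM: predict a token at each chosen position.

    A real model would condition on the rest of the sequence; here we just
    emit a deterministic placeholder token so the loop structure is visible.
    """
    out = list(seq)
    for i in positions:
        out[i] = f"tok{i}"
    return out

def generate(length, select, steps):
    """Generalized decoding loop: alternate position selection and refinement."""
    seq = [MASK] * length
    for t in range(steps):
        seq = fill(seq, select(seq, t))
    return seq

# Selecting one position left-to-right recovers monotonic autoregressive
# decoding (length steps, one token per step).
autoregressive = generate(4, select=lambda seq, t: [t], steps=4)

# Selecting every position and refining for a fixed number of steps gives a
# constant-time, non-autoregressive (mask-predict-style) schedule.
non_autoregressive = generate(4, select=lambda seq, t: range(len(seq)), steps=2)
```

Different `select` policies (e.g. picking the least-confident positions each step) interpolate between these extremes, which is the sense in which the framework unifies the decoding algorithms compared in the paper.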