Discovering Non-Monotonic Autoregressive Ordering for Text Generation Models using Sinkhorn Distributions

Anonymous

Published: 28 Mar 2022, Last Modified: 05 May 2023 (BT@ICLR2022)
Abstract: In this blog post, we discuss an important but under-explored topic: discovering non-monotonic autoregressive orderings that guide models toward high-quality text generation. We focus on the model proposed in the ICLR 2021 paper by Li et al. (2021), which uses Gumbel-Sinkhorn distributions to supply a decoder with good generation orders during training. The trained models produce high-quality outputs on four important NLG tasks: (a) image captioning, (b) code generation, (c) text summarization, and (d) machine translation. Interestingly, the model's behavior mirrors human writing in some sense: deciding what to write about before figuring out how to write it.
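To give a concrete feel for the core building block mentioned above, here is a minimal NumPy sketch of the Gumbel-Sinkhorn operator: Gumbel noise is added to a matrix of scores, and alternating row/column normalization (the Sinkhorn operator) relaxes it toward a doubly stochastic matrix, i.e. a soft permutation over generation positions. Function names, the temperature `tau`, and the iteration count are illustrative choices, not the paper's exact implementation.

```python
import numpy as np

def sinkhorn(log_alpha, n_iters=100):
    """Sinkhorn normalization in log space: alternately normalize
    rows and columns so the exponentiated matrix approaches a
    doubly stochastic matrix."""
    for _ in range(n_iters):
        # Row normalization (each row sums to 1 after exp)
        log_alpha = log_alpha - np.logaddexp.reduce(log_alpha, axis=1, keepdims=True)
        # Column normalization (each column sums to 1 after exp)
        log_alpha = log_alpha - np.logaddexp.reduce(log_alpha, axis=0, keepdims=True)
    return np.exp(log_alpha)

def gumbel_sinkhorn(scores, tau=1.0, n_iters=100, rng=None):
    """Sample a soft permutation matrix: perturb scores with Gumbel
    noise, scale by temperature tau, then apply the Sinkhorn operator.
    Lower tau pushes the result closer to a hard permutation."""
    rng = np.random.default_rng() if rng is None else rng
    gumbel = -np.log(-np.log(rng.uniform(size=scores.shape)))
    return sinkhorn((scores + gumbel) / tau, n_iters=n_iters)
```

In the ordering model, each sampled soft permutation stands in for a generation order over the target tokens, and the temperature is annealed so samples become increasingly permutation-like while staying differentiable.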
ICLR Paper: https://openreview.net/forum?id=jP1vTH3inC