Continuation is a Sub-Task of Fill in the Blank: Why Not Train for Both?


16 Jun 2021 (modified: 05 May 2023) · ACL ARR 2021 Jun Blind Submission · Readers: Everyone
Abstract: The task of inserting text at a specified position in a passage, known as fill-in-the-blank, is useful for a variety of applications in which writers interact with a natural language generation (NLG) system to craft text. However, NLG research has mostly focused on continuation models that append text to the end of a passage. Since continuation is in fact a sub-task of fill-in-the-blank, namely the case where the blank is placed at the end of the sequence, we propose training a single model that can effectively handle both tasks. The result is improved efficiency, since only one model needs to be maintained, with no negative impact on performance at either task.
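The relationship between the two tasks can be illustrated with a small sketch of how training pairs might be constructed. The function and token names below (`make_example`, `<blank>`, `<sep>`) are hypothetical illustrations, not the paper's actual formatting: the input marks where a span was removed, the target is the removed span, and continuation falls out as the special case where the removed span runs to the end of the passage.

```python
def make_example(passage, start, end, blank_token="<blank>", sep_token="<sep>"):
    """Build a fill-in-the-blank training pair.

    The source is the passage with the span [start, end) replaced by a
    blank marker; the target is the removed span itself. When `end` is
    the passage length, this reduces to ordinary continuation.
    """
    source = passage[:start] + blank_token + passage[end:] + sep_token
    target = passage[start:end]
    return source, target

passage = "The cat sat on the mat."

# Blank in the middle: a true fill-in-the-blank example.
src_mid, tgt_mid = make_example(passage, 4, 8)

# Blank at the end: equivalent to a continuation example.
src_end, tgt_end = make_example(passage, 12, len(passage))
```

Because both cases share one input format, a single sequence-to-sequence model can be trained on a mixture of blank positions and still serve as a standard continuation model at inference time.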