Few-shot Query-oriented Summarization with Prefix-merging

Anonymous

16 Feb 2022 (modified: 05 May 2023) ACL ARR 2022 February Blind Submission
Abstract: Query-oriented summarization is widely regarded as an important extension of text summarization: it aims to generate a concise highlight of a document for a given query. Unlike generic text summarization, however, query-oriented summarization has long been plagued by the lack of high-quality, large-scale datasets. In this paper, we investigate whether the knowledge of text summarization and question answering can be integrated and transferred to assist few-shot learning in query-oriented summarization. We draw inspiration from prefix-tuning, whose prefix can be regarded as containing task-specific knowledge. We propose prefix-merging, a prefix-based pretraining strategy for few-shot learning in natural language generation tasks. It allows us to control and integrate task knowledge across multiple basic tasks through a proper prefix design, and to apply the merged prefix to the downstream task. With only a small number of trainable parameters, prefix-merging outperforms fine-tuning on query-oriented summarization. We further discuss the influence of different prefix designs and propose a visualized explanation of how prefix-merging works.
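In prefix-tuning, each attention layer attends over a short sequence of trainable key/value vectors prepended to the input's keys and values. A minimal NumPy sketch of the idea behind merging task prefixes is below; the concatenation-style merge, the prefix lengths, and all variable names are illustrative assumptions, not the paper's exact design:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_with_prefix(x, W_q, W_k, W_v, prefix_k, prefix_v):
    """Single-head attention where trainable prefix key/value vectors
    are prepended to the input-derived keys and values."""
    Q = x @ W_q                                      # (seq_len, d)
    K = np.concatenate([prefix_k, x @ W_k], axis=0)  # (prefix_len + seq_len, d)
    V = np.concatenate([prefix_v, x @ W_v], axis=0)
    scores = Q @ K.T / np.sqrt(K.shape[1])
    return softmax(scores) @ V                       # (seq_len, d)

rng = np.random.default_rng(0)
d = 8
x = rng.normal(size=(5, d))                          # toy input embeddings
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))

# Stand-ins for prefixes trained on two basic tasks
# (text summarization and question answering).
sum_prefix_k, sum_prefix_v = rng.normal(size=(4, d)), rng.normal(size=(4, d))
qa_prefix_k, qa_prefix_v = rng.normal(size=(4, d)), rng.normal(size=(4, d))

# Prefix-merging, sketched as concatenation: both task prefixes
# become one merged prefix applied to the downstream task.
merged_k = np.concatenate([sum_prefix_k, qa_prefix_k], axis=0)
merged_v = np.concatenate([sum_prefix_v, qa_prefix_v], axis=0)

out = attention_with_prefix(x, W_q, W_k, W_v, merged_k, merged_v)
print(out.shape)  # (5, 8)
```

Because only the prefix vectors would be trained, the trainable parameter count stays far below that of fine-tuning the full model, which is the setting the abstract describes.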
Paper Type: long