CorDial: Coarse-to-fine Abstractive Dialogue Summarization with Controllable Granularity

28 Sept 2020 (modified: 05 May 2023) · ICLR 2021 Conference Blind Submission · Readers: Everyone
Keywords: dialogue, summarization, controllable generation, natural language processing
Abstract: Dialogue summarization is challenging because of multiple speaker standpoints, casual spoken language, and limited labeled data. In this paper, we propose CorDial, which aims to improve abstractive dialogue summarization quality while enabling control over summary granularity. We propose 1) a coarse-to-fine generation strategy that autoregressively generates a summary draft followed by a final summary. The summary draft, which provides weakly supervised signals, is composed of pseudo-labeled interrogative pronoun categories and noisy key phrases extracted with a constituency parser. 2) A simple strategy to control the granularity of the final summary: CorDial can predict and control the number of summary sentences for a given dialogue by predicting and highlighting different text spans in the source text. Our model achieves state-of-the-art performance on SAMSum, the largest dialogue summarization corpus. We conduct comprehensive error analysis and show that our generated summaries are competitive with human-annotated summaries in human evaluation.
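To make the draft-construction idea concrete, below is a minimal sketch (not the authors' code) of how a weakly supervised summary draft could be assembled: each dialogue turn gets a pseudo-labeled interrogative pronoun category and a set of noisy key phrases read off its constituency parse. The category names, the bracketed parse strings, and the helpers extract_key_phrases/compose_draft are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of composing a coarse summary draft from pseudo-labels and
# constituency-parse key phrases. Parses are given as bracketed strings here;
# in practice they would come from an off-the-shelf constituency parser.
from nltk import Tree


def extract_key_phrases(parse_str, labels=("NP", "VP")):
    """Collect surface strings of NP/VP constituents as noisy key phrases."""
    tree = Tree.fromstring(parse_str)
    return [" ".join(sub.leaves())
            for sub in tree.subtrees(filter=lambda t: t.label() in labels)]


def compose_draft(turns):
    """Join pseudo-labeled categories and key phrases into one draft string."""
    pieces = []
    for category, parse_str in turns:  # category is a pseudo-label, e.g. "WHAT"
        phrases = extract_key_phrases(parse_str)
        pieces.append(f"[{category}] " + " ; ".join(phrases))
    return " | ".join(pieces)


# Toy example: (pseudo-labeled interrogative category, parse of the turn).
turns = [
    ("WHAT", "(S (NP (NNP Amanda)) (VP (VBZ bakes) (NP (NNS cookies))))"),
]
print(compose_draft(turns))
# -> "[WHAT] Amanda ; bakes cookies ; cookies"
```

In a coarse-to-fine setup, such a draft string would be generated first and then conditioned on when producing the final summary; the exact draft format used by CorDial is described in the paper.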
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
One-sentence Summary: We propose CorDial, a state-of-the-art dialogue summarization model with coarse-to-fine generation and granularity controllability.
Supplementary Material: zip
Reviewed Version (pdf): https://openreview.net/references/pdf?id=bwA3PAgs-e