CTL-Prompt: Contrastive Topic-Length Prompt Learning for Dialogue Summarization

ACL ARR 2024 June Submission 1408 Authors

14 Jun 2024 (modified: 02 Jul 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: The prevalence of online meetings has highlighted the necessity of dialogue summarization. Topic summarization is one domain that has attracted much interest from industry. However, past work uses either topic or length prompts, which tend to generate almost identical summaries across similar and even different topics. This study proposes Contrastive Topic-Length Prompt Learning (CTL-Prompt), a simple method that generates topic-based summaries. To produce concise yet diverse summaries across topics, we propose contrastive learning on topic-length prompts, which leverages positive and negative pairs to allow the model to learn the similarities and differences between topics. Results show that our model outperforms baseline models on ROUGE, BERTScore, and human evaluation on the DialogSum and MACSum datasets. Our work can be found at [anonymized].
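To make the contrastive objective concrete, below is a minimal PyTorch sketch of an InfoNCE-style loss over summary representations, where the positive pair comes from the same topic prompt and negatives from other topics. The function name, tensor shapes, and the specific InfoNCE formulation are illustrative assumptions for exposition; the paper's actual implementation may differ.

```python
import torch
import torch.nn.functional as F

def contrastive_topic_loss(anchor, positive, negatives, temperature=0.1):
    """Illustrative InfoNCE-style contrastive loss (assumed formulation).

    Pulls same-topic summary representations together and pushes
    different-topic representations apart.

    anchor:    (d,)   embedding of a summary for one topic-length prompt
    positive:  (d,)   embedding of another summary for the same topic
    negatives: (n, d) embeddings of summaries for other topics
    """
    # Cosine similarity via L2-normalized dot products.
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = anchor @ positive / temperature            # scalar
    neg_sim = negatives @ anchor / temperature           # (n,)
    logits = torch.cat([pos_sim.unsqueeze(0), neg_sim])  # (1 + n,)

    # The positive pair sits at index 0, so the target label is 0.
    return F.cross_entropy(logits.unsqueeze(0),
                           torch.zeros(1, dtype=torch.long))
```

In this sketch, lowering the loss increases similarity between same-topic summaries relative to cross-topic ones, which is one plausible way to realize the "similarities and differences of topics" objective the abstract describes.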
Paper Type: Long
Research Area: Summarization
Research Area Keywords: Dialogue Summarization, Topic Guided Summary, Contrastive Learning, Prompt
Contribution Types: NLP engineering experiment
Languages Studied: English
Submission Number: 1408