Pseudo-Label Enhanced Prototypical Contrastive Learning for Uniformed Intent Discovery

ACL ARR 2024 April Submission 279 Authors

15 Apr 2024 (modified: 20 May 2024) · ACL ARR 2024 April Submission · License: CC BY 4.0
Abstract: Discovering new intents is a crucial capability for task-oriented dialogue systems. However, achieving feature adaptation on unlabeled data while preserving the features learned on labeled data is challenging. Existing methods either handle the two processes in a pipeline manner, which leaves a gap between intent representation and the clustering process, or use typical contrastive clustering, which overlooks potentially diverse samples within the same class. To address this challenge, we propose a pseudo-label enhanced contrastive and prototype learning model. We iteratively utilize pseudo-labels to explore potential positive samples for contrastive learning and to bridge the gap between representation and clustering. To enable better knowledge transfer, we design a prototype learning method that integrates supervised and pseudo signals from IND and OOD samples. In addition, our method proves effective in two different settings of new intent discovery, whereas existing methods are typically designed for only one. Experiments on two benchmark datasets and two task settings demonstrate the effectiveness of our approach.
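The abstract's core idea of using pseudo-labels to expand the positive set for contrastive learning can be sketched as a supervised-contrastive-style loss in which all samples sharing the same (pseudo-)cluster assignment are treated as positives. This is a minimal illustrative sketch, not the authors' implementation; the function name, NumPy formulation, and default temperature are assumptions.

```python
import numpy as np

def pseudo_label_contrastive_loss(embeddings, pseudo_labels, temperature=0.07):
    """SupCon-style loss where positives are samples with the same pseudo-label.

    embeddings: (N, D) array of features; pseudo_labels: (N,) integer array,
    e.g. cluster assignments produced in the current training iteration.
    """
    # L2-normalize features so dot products are cosine similarities
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = (z @ z.T) / temperature                    # (N, N) similarity logits
    n = len(z)
    self_mask = np.eye(n, dtype=bool)
    # positives: same pseudo-label, excluding the anchor itself
    pos_mask = (pseudo_labels[:, None] == pseudo_labels[None, :]) & ~self_mask
    # exclude self-similarity from the softmax denominator
    sim = np.where(self_mask, -np.inf, sim)
    # numerically stable log-softmax over each row
    m = sim.max(axis=1, keepdims=True)
    log_prob = sim - (m + np.log(np.exp(sim - m).sum(axis=1, keepdims=True)))
    # average negative log-likelihood over each anchor's positive set
    pos_counts = np.maximum(pos_mask.sum(axis=1), 1)
    per_anchor = -np.where(pos_mask, log_prob, 0.0).sum(axis=1) / pos_counts
    has_pos = pos_mask.any(axis=1)
    # only anchors with at least one pseudo-positive contribute
    return per_anchor[has_pos].mean() if has_pos.any() else 0.0
```

As the pseudo-labels are refreshed each iteration, the positive sets grow to include diverse same-class samples that instance-level contrastive learning would treat as negatives, which is the gap the abstract points to.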
Paper Type: Long
Research Area: Dialogue and Interactive Systems
Research Area Keywords: task-oriented; domain adaptation; pre-training; contrastive learning
Contribution Types: Model analysis & interpretability
Languages Studied: English
Submission Number: 279