Dialogue Topic Shift Based on Prompt Learning

ACL ARR 2025 February Submission1936 Authors

14 Feb 2025 (modified: 09 May 2025) · License: CC BY 4.0
Abstract: The task of dialogue topic shift detection aims to identify whether the current utterance shifts topic relative to the preceding context of a conversation. Existing research often treats n-gram features as equally important; in practice, however, their significance depends on the specific context, which shapes the model's semantic understanding of the whole text. To address this issue, we propose a model based on prompt learning with multi-scale feature attention. Guided by the prompt learning module, the multi-scale feature attention layer captures textual semantic features more effectively, thereby improving the accuracy of topic shift detection in dialogues. We evaluated the proposed model on the Chinese CNTD dataset and the English TIAGE dataset, and experimental results show significant performance improvements over existing approaches. Furthermore, comparing multi-scale and single-scale feature attention models, we found that performance peaked when k was set to 4. Finally, ablation studies and analyses confirm the effectiveness and robustness of the model, with each component contributing performance gains to varying degrees.
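The abstract does not specify the exact architecture, but the idea of attending over n-gram features at multiple scales (n = 1..k, with k = 4 performing best) can be illustrated with a minimal dependency-free sketch. Everything here is an assumption for illustration: the mean-pooling of windows, the norm-based scale scoring (a stand-in for a learned, prompt-conditioned scorer), and the function names are all hypothetical, not the authors' method.

```python
import math

def ngram_features(embs, n):
    """Average token embeddings over every contiguous window of size n
    (hypothetical pooling; the paper's actual feature extractor may differ)."""
    dim = len(embs[0])
    return [[sum(tok[d] for tok in embs[i:i + n]) / n for d in range(dim)]
            for i in range(len(embs) - n + 1)]

def multi_scale_attention(embs, k=4):
    """Combine n-gram features for n = 1..k with softmax attention weights,
    so more informative scales dominate instead of being weighted equally."""
    dim = len(embs[0])
    scale_vecs = []
    for n in range(1, k + 1):
        feats = ngram_features(embs, n)
        # mean-pool windows into one vector per scale
        scale_vecs.append([sum(f[d] for f in feats) / len(feats)
                           for d in range(dim)])
    # score each scale; here a simple L2-norm logit stands in for a
    # learned, prompt-guided scoring function
    scores = [math.sqrt(sum(v * v for v in vec)) for vec in scale_vecs]
    m = max(scores)
    exp = [math.exp(s - m) for s in scores]
    weights = [e / sum(exp) for e in exp]
    # attention-weighted sum of the per-scale vectors
    return [sum(w * vec[d] for w, vec in zip(weights, scale_vecs))
            for d in range(dim)]
```

With k = 4, a dialogue segment needs at least four token embeddings; the returned vector has the same dimensionality as the input embeddings and can feed a downstream shift/no-shift classifier.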
Paper Type: Long
Research Area: Dialogue and Interactive Systems
Research Area Keywords: dialogue state tracking
Languages Studied: English; Chinese
Submission Number: 1936