Domain-Slot Aware Contrastive Learning for Improved Dialogue State Tracking

Published: 2024 · Last Modified: 21 Jan 2026 · ICASSP 2024 · CC BY-SA 4.0
Abstract: Large-scale pre-trained neural language models have facilitated state-of-the-art performance on Dialogue State Tracking (DST) tasks. Existing work models the semantic correlation between the dialogue context and each (domain, slot) pair encoded by BERT and makes predictions from it. Despite its effectiveness, this approach ignores the fact that there is no perfect semantic correspondence between a (domain, slot) pair and the dialogue context. In this paper, we propose a domain-slot aware contrastive learning framework to address this problem. It offers three methods for bridging the semantic gap between the dialogue context and the (domain, slot) pair: constructing training sample pairs, fine-tuning the BERT model on them, and using the fine-tuned model as the base DST model. Experiments demonstrate that our proposed method improves the performance of the baseline model on the MultiWOZ 2.1 and MultiWOZ 2.4 datasets, yielding competitive results.
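The abstract does not spell out the training objective, but contrastive fine-tuning of this kind is commonly driven by an InfoNCE-style loss that pulls each dialogue-context embedding toward its matching (domain, slot) embedding and pushes it away from the other pairs in the batch. The following is a minimal, illustrative sketch of that idea; the function name, the toy vectors, and the temperature value are assumptions for demonstration, not the paper's actual implementation.

```python
import math

def info_nce_loss(context_vecs, slot_vecs, temperature=0.1):
    """Toy InfoNCE-style contrastive loss (illustrative, not the paper's code).

    context_vecs[i] and slot_vecs[i] form the positive pair; every other
    slot vector in the batch serves as an in-batch negative.
    """
    def normalize(v):
        n = math.sqrt(sum(x * x for x in v))
        return [x / n for x in v]

    cs = [normalize(v) for v in context_vecs]
    ss = [normalize(v) for v in slot_vecs]
    total = 0.0
    for i, c in enumerate(cs):
        # cosine similarity of this context to every slot embedding
        logits = [sum(a * b for a, b in zip(c, s)) / temperature for s in ss]
        m = max(logits)  # subtract max for numerical stability
        log_z = m + math.log(sum(math.exp(l - m) for l in logits))
        total += -(logits[i] - log_z)  # negative log-softmax of the positive
    return total / len(cs)

# Aligned pairs (context matches its slot) should score a lower loss
# than shuffled pairs, which is the signal the fine-tuning exploits.
ctx = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
loss_aligned = info_nce_loss(ctx, ctx)
loss_shuffled = info_nce_loss(ctx, [ctx[1], ctx[2], ctx[0]])
```

In a batch like this, the loss is minimized when each context is most similar to its own (domain, slot) pair, so fine-tuning BERT with such an objective tightens the semantic correspondence the abstract describes.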