Towards Building Accurate End-to-End Task-Oriented Dialog Systems with a Simple Cache

Anonymous

17 Apr 2022 (modified: 05 May 2023) · ACL ARR 2022 April Blind Submission · Readers: Everyone
Abstract: End-to-end task-oriented dialog (TOD) systems have achieved promising performance by leveraging the sophisticated natural language understanding and natural language generation capabilities of pre-trained models. This work equips TOD systems with greater flexibility through a simple cache. The cache makes it possible to dynamically update a TOD system, disabling existing domains, intents, and slots or adding new, unseen ones, without intensive retraining. Towards this end, we first fine-tune a retrieval module to correctly retrieve the Top-$N$ slot information entries from the cache, and then train generative end-to-end TOD models with the cache. During generation, the models can refer to and ground their responses in both the dialog history and the retrieved information. The introduced cache is easy to construct, and the backbone models of the TOD systems are compatible with existing pre-trained generative models. Extensive experiments demonstrate the superior performance of our proposed end-to-end framework over baselines; e.g., the Non-Empty JGA is improved by $6.67\%$ compared with BART-Large.
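The abstract's pipeline (an editable cache of slot-information entries, a retriever that scores entries against the dialog history, and a generator that conditions on both the history and the Top-$N$ retrieved entries) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the class and function names (`Cache`, `retrieve_top_n`, `build_generator_input`), the `[KB]`/`[HISTORY]` separators, and the bag-of-words scorer standing in for the fine-tuned retrieval module are all assumptions.

```python
# Hypothetical sketch of cache-augmented TOD generation, assuming a simple
# similarity-based retriever in place of the paper's fine-tuned module.
from collections import Counter
import math


def embed(text):
    """Toy bag-of-words embedding; a stand-in for a learned encoder."""
    return Counter(text.lower().split())


def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


class Cache:
    """Editable store of slot-information entries: entries for new or unseen
    domains/intents/slots can be added, and stale ones removed, at run time
    without retraining the downstream generator."""

    def __init__(self):
        self.entries = []

    def add(self, entry):
        self.entries.append(entry)

    def remove(self, entry):
        self.entries.remove(entry)

    def retrieve_top_n(self, dialog_history, n=3):
        """Return the N cache entries most similar to the dialog history."""
        query = embed(dialog_history)
        ranked = sorted(self.entries,
                        key=lambda e: cosine(query, embed(e)),
                        reverse=True)
        return ranked[:n]


def build_generator_input(dialog_history, cache, n=3):
    """Concatenate the retrieved entries with the dialog history so the
    generative model can ground on both, as the abstract describes."""
    retrieved = cache.retrieve_top_n(dialog_history, n)
    return " [KB] ".join(retrieved) + " [HISTORY] " + dialog_history
```

A usage example under the same assumptions: populating the cache with a few entries and building the generator input for one user turn.

```python
cache = Cache()
cache.add("restaurant name = pizza express ; area = centre")
cache.add("hotel name = acorn guest house ; stars = 4")
cache.add("train destination = cambridge ; leave = 09:00")
model_input = build_generator_input("i need a restaurant in the centre",
                                    cache, n=1)
```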
Paper Type: long