Abstract: Task-oriented dialogue systems often degrade when operating in dynamic environments where domains change frequently. This paper introduces the Domain Change Simulator (DCS), a novel framework designed to simulate domain changes and evaluate their impact on dialogue systems. The simulator allows controlled experimentation with various types and magnitudes of domain shift, providing valuable insights for system developers. Beyond simulating domain changes, the framework integrates Generative Dialogue Domain Adaptation (G-DDA), which uses large language models to dynamically generate slot-value substitutions. This approach improves the system's adaptability to new domain contexts without requiring extensive retraining. Through a series of experiments on the MultiWOZ dataset, we demonstrate how the DCS enables precise prediction of system performance under evolving domains, offering a robust tool for improving the resilience of task-oriented dialogue agents. Our results highlight the potential of generative models in maintaining system coherence and domain adherence, even in the face of substantial domain shifts.
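The abstract describes G-DDA as using a large language model to generate slot-value substitutions that track a simulated domain change. The sketch below is a minimal illustration of that idea, assuming a MultiWOZ-style turn representation; the function names, slot keys, and the stubbed `call_llm` are hypothetical stand-ins and do not reflect the paper's actual implementation.

```python
# Illustrative sketch only: the abstract does not specify the DCS or G-DDA
# interfaces, so every name below (call_llm, apply_shift, the slot keys)
# is hypothetical and simplified for exposition.
import json

def call_llm(prompt: str) -> str:
    """Stand-in for a real LLM call; in a G-DDA-style setup this would query a
    large language model to generate candidate slot-value substitutions."""
    return "maple & thyme\nthe copper kettle\nharbour lights bistro"

def propose_substitutions(slot: str, known_values: list[str], n: int = 3) -> list[str]:
    """Ask the (stubbed) LLM for plausible values for `slot` that were not
    seen in the original domain, emulating a controlled domain shift."""
    prompt = (
        f"Slot: {slot}\n"
        f"Known values: {', '.join(known_values)}\n"
        f"Propose {n} plausible new values not in the list, one per line."
    )
    return call_llm(prompt).splitlines()[:n]

def apply_shift(turn: dict, slot: str, old_value: str, new_value: str) -> dict:
    """Rewrite one MultiWOZ-style turn so that `old_value` is replaced by
    `new_value` in both the utterance and the dialogue state."""
    shifted = dict(turn)
    shifted["utterance"] = turn["utterance"].replace(old_value, new_value)
    shifted["state"] = {**turn.get("state", {}), slot: new_value}
    return shifted

if __name__ == "__main__":
    turn = {
        "utterance": "book a table at the golden wok for two",
        "state": {"restaurant-name": "the golden wok"},
    }
    candidates = propose_substitutions("restaurant-name", ["the golden wok"])
    shifted = apply_shift(turn, "restaurant-name", "the golden wok", candidates[0])
    print(json.dumps(shifted, indent=2))
```

Because the substitution is applied consistently to the utterance and the annotated state, the magnitude of the shift can be controlled (how many slots, how many values) while the dialogue annotations remain internally consistent, which is what makes the simulated shift measurable.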
External IDs: doi:10.1142/s2196888825500162