IterNLU: An Iterative Learning Framework for Joint Intent Detection and Slot Filling

Anonymous

04 Mar 2022 (modified: 05 May 2023) · Submitted to NLP for ConvAI · Readers: Everyone
Keywords: Natural Language Understanding, Intent Detection, Slot Filling, Iterative Learning, Intent-Slot Correlation
TL;DR: We propose a novel joint learning framework for NLU that exploits intent-slot correlation through iterative learning to achieve state-of-the-art performance, validated on two publicly available datasets.
Abstract: Exploiting the cross-impact between intent detection and slot filling, the two main tasks in natural language understanding (NLU), has been a recent trend in NLU research. While this cross-impact has often been modeled implicitly through joint learning, various methods have also been proposed to explicitly use slot information to facilitate intent detection and/or intent information to facilitate slot filling. However, previous work has not yet fully explored the potential of this cross-impact. To better capture the benefit of intent-slot correlation, this paper proposes a novel joint learning framework, named IterNLU, that learns intent detection and slot filling iteratively, feeding updated slot information into intent detection and updated intent information into slot filling at each iteration. In each iteration, we extract the most effective intent/slot representation from the information available so far to assist the subsequent slot/intent prediction. Our approach achieves 0.95% and 1.74% absolute gains in semantic frame accuracy over the previous state-of-the-art NLU approach on two public benchmark datasets, ATIS and Snips, respectively. We also conduct systematic analyses of the effect of feeding in different types of intent/slot information and of system efficiency.
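The abstract does not give implementation details, so the following is only a rough, hypothetical sketch of the iterative refinement idea it describes: intent prediction is conditioned on the latest slot information, slot prediction on the latest intent information, and the two alternate for a fixed number of iterations. The encoder choice (BiLSTM), linear prediction heads, and all names and dimensions below are assumptions for illustration, not the paper's actual architecture.

```python
# Hypothetical sketch of an iterative intent/slot refinement loop (not the paper's code).
import torch
import torch.nn as nn

class IterativeNLU(nn.Module):
    def __init__(self, vocab_size=1000, hidden=128, n_intents=7, n_slots=72, n_iters=2):
        super().__init__()
        self.n_iters = n_iters
        self.embed = nn.Embedding(vocab_size, hidden)
        self.encoder = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        enc_dim = 2 * hidden
        # Each head conditions one task on a summary of the other task's current prediction.
        self.intent_head = nn.Linear(enc_dim + n_slots, n_intents)
        self.slot_head = nn.Linear(enc_dim + n_intents, n_slots)
        self.register_buffer("init_slot", torch.zeros(n_slots))

    def forward(self, tokens):
        x = self.embed(tokens)                      # (B, T, H)
        enc, _ = self.encoder(x)                    # (B, T, 2H) token-level features
        pooled = enc.mean(dim=1)                    # (B, 2H) utterance-level summary
        B, T, _ = enc.shape
        slot_summary = self.init_slot.expand(B, -1) # no slot information before iteration 1
        intent_logits, slot_logits = None, None
        for _ in range(self.n_iters):
            # Intent detection uses the latest slot information.
            intent_logits = self.intent_head(torch.cat([pooled, slot_summary], dim=-1))
            intent_feat = intent_logits.softmax(-1).unsqueeze(1).expand(B, T, -1)
            # Slot filling uses the latest intent information.
            slot_logits = self.slot_head(torch.cat([enc, intent_feat], dim=-1))
            # Summarize slot predictions to feed back into the next iteration.
            slot_summary = slot_logits.softmax(-1).mean(dim=1)
        return intent_logits, slot_logits

model = IterativeNLU()
tokens = torch.randint(0, 1000, (4, 12))            # toy batch of 4 utterances, 12 tokens each
intent_logits, slot_logits = model(tokens)
print(intent_logits.shape, slot_logits.shape)        # (4, 7) and (4, 12, 72)
```

Under this reading, each pass refines both predictions using the other task's most recent output; how the actual IterNLU model builds its intent/slot representations is described in the paper itself.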