IntentGPT: Few-shot Intent Discovery with Large Language Models

Published: 11 Mar 2024 · Last Modified: 22 Apr 2024 · LLMAgents @ ICLR 2024 Poster · CC BY 4.0
Keywords: LLM, few-shot, intent discovery
TL;DR: We present IntentGPT, a training-free method for intent discovery in dialogue systems that assigns user queries to known classes (intents) or discovers new ones. We leverage GPT-4 and novel few-shot techniques, outperforming previous methods.
Abstract: In today's digitally driven world, dialogue systems play a pivotal role in enhancing user interactions, from customer service to virtual assistants. In these dialogues, it is important to identify users' goals automatically in order to resolve their needs promptly. This has necessitated the integration of models that perform Intent Detection. However, users' intents are diverse and dynamic, making it challenging to maintain a fixed set of predefined intents. As a result, a more practical approach is to develop a model capable of identifying new intents as they emerge. We address the challenge of Intent Discovery, an area that has drawn significant attention in recent research efforts. Existing methods must train on a substantial amount of data to correctly identify new intents, demanding significant human effort. To overcome this, we introduce IntentGPT, a novel method that efficiently prompts Large Language Models (LLMs) such as GPT-4 to effectively discover new intents with minimal labeled data. IntentGPT comprises an In-Context Prompt Generator, which generates informative prompts for In-Context Learning; an Intent Predictor, which classifies and discovers the user intents behind utterances; and a Semantic Few-Shot Sampler, which leverages embedding similarity to select the closest examples from the labeled data. Our experiments show that IntentGPT outperforms previous methods that require extensive domain-specific data and fine-tuning on popular benchmarks, including CLINC and BANKING.
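The abstract does not give implementation details, but the pipeline it describes (embedding-based few-shot selection followed by an in-context prompt that lists known intents and asks the LLM to classify or propose a new intent) can be sketched as below. This is a minimal, hypothetical sketch: the `select_few_shot` and `build_prompt` names, the `all-MiniLM-L6-v2` encoder, and the prompt wording are illustrative assumptions, not the paper's exact design.

```python
# Hypothetical sketch of semantic few-shot sampling + prompt assembly for
# intent discovery. Uses sentence-transformers for embeddings; model name,
# function names, and prompt text are illustrative assumptions.
import numpy as np
from sentence_transformers import SentenceTransformer


def select_few_shot(query, examples, encoder, k=5):
    """Return the k labeled (utterance, intent) pairs closest to the query."""
    texts = [utt for utt, _ in examples]
    emb = encoder.encode(texts + [query], normalize_embeddings=True)
    sims = emb[:-1] @ emb[-1]          # cosine similarity (unit-norm vectors)
    top = np.argsort(-sims)[:k]
    return [examples[i] for i in top]


def build_prompt(query, shots, known_intents):
    """Assemble an in-context prompt: known intents, nearest examples, then the query."""
    lines = [
        "Known intents: " + ", ".join(sorted(known_intents)),
        "Classify the utterance into a known intent, or propose a new one.",
        "",
    ]
    lines += [f"Utterance: {u}\nIntent: {i}" for u, i in shots]
    lines += ["", f"Utterance: {query}", "Intent:"]
    return "\n".join(lines)


if __name__ == "__main__":
    encoder = SentenceTransformer("all-MiniLM-L6-v2")
    labeled = [
        ("I lost my credit card", "card_lost"),
        ("How do I reset my PIN?", "change_pin"),
        ("What's my account balance?", "check_balance"),
    ]
    query = "My card went missing yesterday"
    shots = select_few_shot(query, labeled, encoder, k=2)
    prompt = build_prompt(query, shots, {intent for _, intent in labeled})
    print(prompt)
    # The prompt would then be sent to an LLM such as GPT-4; if the model
    # returns an intent outside the known set, that intent is treated as
    # newly discovered and added to the set for subsequent queries.
```

Under this reading, the "training-free" property comes from the fact that only the frozen embedding model and the LLM are used at inference time; the labeled pool and the growing intent set are the only state that changes.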
Submission Number: 103