Enhancing Customer Service Chatbots with Context-Aware NLU, Selective Attention and Multi-task Learning for Improved Intent Classification
Abstract: Customer service chatbots are conversational systems designed to address customer queries. By directing customers to automated workflows, they enable faster query resolution. A crucial step in this process is classifying the customer's intent. Most existing intent classification models in the customer care domain rely solely on the customer query for prediction, which can be ambiguous and reduce model accuracy. For example, a query such as "I did not receive my package" could indicate a delayed order or an order marked as delivered that the customer never received, each requiring a different resolution path. Utilizing additional information, such as the customer's order delivery status, can improve intent prediction accuracy. In this study, we introduce a context-aware NLU architecture that incorporates both the customer query and the customer's past order history as context. A novel selective attention module extracts the relevant context features, leading to improved model accuracy. We also propose a multi-task learning paradigm to effectively utilize two types of labels: one derived from the user query alone and the other from the full conversation with a human agent. Our proposed method, Multi-Task Learning-Contextual NLU with Selective Attention Weighted Context (MTL-CNLU-SAWC), achieves a 4.8% improvement in top-2 intent accuracy over a baseline model that uses only user queries, and a 3.5% improvement over existing state-of-the-art models that combine query and context.
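The abstract does not specify the model internals, but the described idea, scoring each order-history feature against the query and training two intent heads jointly, can be sketched as follows. This is a minimal illustrative sketch assuming a PyTorch setup; all module, parameter, and dimension names are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of selective-attention-weighted context fusion with a
# multi-task intent head. Names and dimensions are illustrative only.
import torch
import torch.nn as nn


class SelectiveAttentionContextFusion(nn.Module):
    """Scores each context feature (e.g. order-status fields) against the
    query encoding and fuses a softly weighted context summary with the query."""

    def __init__(self, query_dim: int, ctx_dim: int, hidden_dim: int, n_intents: int):
        super().__init__()
        self.ctx_proj = nn.Linear(ctx_dim, query_dim)    # align context features to query space
        self.attn_score = nn.Linear(query_dim, 1)        # relevance score per context feature
        self.fuse = nn.Linear(2 * query_dim, hidden_dim)
        # Two heads for the multi-task setup: query-only labels vs. full-conversation labels.
        self.head_query_label = nn.Linear(hidden_dim, n_intents)
        self.head_conv_label = nn.Linear(hidden_dim, n_intents)

    def forward(self, query_vec, ctx_feats):
        # query_vec: (batch, query_dim); ctx_feats: (batch, n_ctx, ctx_dim)
        ctx = torch.tanh(self.ctx_proj(ctx_feats))                # (batch, n_ctx, query_dim)
        scores = self.attn_score(ctx * query_vec.unsqueeze(1))    # (batch, n_ctx, 1)
        weights = torch.softmax(scores, dim=1)                    # selective attention weights
        ctx_summary = (weights * ctx).sum(dim=1)                  # (batch, query_dim)
        hidden = torch.relu(self.fuse(torch.cat([query_vec, ctx_summary], dim=-1)))
        return self.head_query_label(hidden), self.head_conv_label(hidden)


# Usage sketch: a joint loss over both label types, as in a multi-task objective.
model = SelectiveAttentionContextFusion(query_dim=768, ctx_dim=32, hidden_dim=256, n_intents=50)
q = torch.randn(4, 768)        # query encoding, e.g. from a sentence encoder
c = torch.randn(4, 10, 32)     # 10 order-history / delivery-status features per example
logits_q, logits_conv = model(q, c)
loss = (nn.functional.cross_entropy(logits_q, torch.randint(0, 50, (4,)))
        + nn.functional.cross_entropy(logits_conv, torch.randint(0, 50, (4,))))
```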
Paper Type: long
Research Area: NLP Applications
Contribution Types: NLP engineering experiment
Languages Studied: English
Consent To Share Submission Details: On behalf of all authors, we agree to the terms above to share our submission details.