Building Multi-turn Query Interpreters for E-commercial Chatbots with Sparse-to-dense Attentive Modeling
Abstract: Predicting query intents is crucial for understanding user demands in chatbots. In real-world applications, accurate query intent classification can be highly challenging, as human-machine interactions are often conducted over multiple turns, which requires models to capture relevant information from the entire context. In addition, query intents tend to be fine-grained (up to hundreds of classes), and many queries are casual chats without clear intents. Hence, it is difficult for standard transformer-based models to capture the complicated language characteristics of dialogues needed to support these applications. In this demo, we present AliMeTerp, a multi-turn query interpretation system that can be seamlessly integrated into e-commercial chatbots in order to generate appropriate responses. Specifically, in AliMeTerp, we introduce SAM-BERT, a pre-trained language model for fine-grained query intent understanding based on Sparse-to-dense Attentive Modeling. For model pre-training, a stack of Sparse-to-dense Attentive Encoders is employed to model complicated dialogue structures at different levels. We further design Hierarchical Multi-grained Classification tasks for model fine-tuning. Experiments show that SAM-BERT consistently outperforms strong baselines on multiple multi-turn chatbot datasets. We further show how AliMeTerp is deployed in real-world e-commercial chatbots to support real-time customer service.
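The abstract does not detail the encoder internals, but the general sparse-to-dense idea can be sketched as a block that first applies sparse (local-window) self-attention and then dense (full) self-attention over the refined representations. The function names, the window size, and the single-head formulation below are illustrative assumptions, not the paper's actual architecture:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, mask=None):
    # Scaled dot-product attention; masked-out positions get -inf-like scores.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    if mask is not None:
        scores = np.where(mask, scores, -1e9)
    return softmax(scores) @ v

def sparse_to_dense_block(x, window=2):
    # Hypothetical sketch: a sparse stage where each token attends only to a
    # local window of neighbors, followed by a dense stage attending to all
    # tokens. A real model would add projections, heads, and residuals.
    n = x.shape[0]
    idx = np.arange(n)
    local_mask = np.abs(idx[:, None] - idx[None, :]) <= window
    x = attention(x, x, x, mask=local_mask)  # sparse (local) attention
    x = attention(x, x, x)                   # dense (global) attention
    return x

tokens = np.random.default_rng(0).normal(size=(6, 4))  # 6 tokens, dim 4
out = sparse_to_dense_block(tokens)
print(out.shape)  # (6, 4)
```

Stacking several such blocks, as the abstract describes, would let early layers model local turn-level structure while later layers integrate the full dialogue.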