A Context-aware Attention Network for Interactive Question Answering

25 Apr 2024 (modified: 21 Jul 2022) · Submitted to ICLR 2017 · Readers: Everyone
Abstract: We develop a new model for Interactive Question Answering (IQA), using Gated Recurrent Unit recurrent networks (GRUs) as encoders for statements and questions, and another GRU as a decoder for outputs. Distinct from previous work, our approach employs context-dependent word-level attention for more accurate statement representations and question-guided sentence-level attention for better context modeling. With these mechanisms, our model learns when it can output an answer and when it instead needs to generate a supplementary question to request additional input. When user feedback is available, it is encoded and applied directly to update the sentence-level attention, from which the answer is inferred. Extensive experiments on QA and IQA datasets quantitatively demonstrate the effectiveness of our model, which significantly improves over conventional QA models.
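
The abstract describes the attention mechanisms only at a high level. As a rough illustration of what question-guided sentence-level attention could look like, here is a minimal PyTorch sketch; the bilinear scoring form, the class name SentenceLevelAttention, and all dimensions are assumptions for illustration, not the paper's actual parameterization.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SentenceLevelAttention(nn.Module):
        """Question-guided sentence-level attention (illustrative sketch).

        Scores each sentence encoding against the question encoding and
        returns an attention-weighted context vector.
        """
        def __init__(self, hidden_size):
            super().__init__()
            # Bilinear scoring between question and sentence encodings
            # (assumed form; the paper may use a different score function).
            self.bilinear = nn.Bilinear(hidden_size, hidden_size, 1)

        def forward(self, sentence_encodings, question_encoding):
            # sentence_encodings: (num_sentences, hidden)
            # question_encoding:  (hidden,)
            n = sentence_encodings.size(0)
            q = question_encoding.unsqueeze(0).expand(n, -1)
            scores = self.bilinear(q, sentence_encodings).squeeze(-1)
            weights = F.softmax(scores, dim=0)
            # Context vector: attention-weighted sum of sentence encodings.
            context = (weights.unsqueeze(-1) * sentence_encodings).sum(dim=0)
            return context, weights

    # Usage with placeholder encodings (in the paper these would be
    # GRU hidden states for each sentence and for the question):
    hidden = 64
    sents = torch.randn(5, hidden)
    question = torch.randn(hidden)
    attn = SentenceLevelAttention(hidden)
    context, weights = attn(sents, question)

In the interactive setting the abstract describes, encoded user feedback would update these sentence-level attention weights before the decoder produces the answer.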
TL;DR: A self-adaptive QA model aware of what it knows and what it does not know for interactive question answering.
Conflicts: uncc.edu, email.arizona.edu, nec-labs.com
Keywords: Deep learning, Natural language processing, Applications
