A Context-aware Attention Network for Interactive Question Answering

08 Dec 2022, 01:58 (modified: 21 Jul 2022, 19:53) · Submitted to ICLR 2017
TL;DR: A self-adaptive QA model aware of what it knows and what it does not know for interactive question answering.
Abstract: We develop a new model for Interactive Question Answering (IQA) that uses Gated Recurrent Unit (GRU) recurrent networks as encoders for statements and questions, and another GRU as a decoder for outputs. Distinct from previous work, our approach employs context-dependent word-level attention for more accurate statement representations and question-guided sentence-level attention for better context modeling. With these mechanisms, our model determines when it can output an answer and when it must instead generate a supplementary question to solicit additional input. When the user's feedback is available, it is encoded and applied directly to update the sentence-level attention, from which the answer is inferred. Extensive experiments on QA and IQA datasets quantitatively demonstrate the effectiveness of our model, with significant improvements over conventional QA models.
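The question-guided sentence-level attention described above can be illustrated with a minimal NumPy sketch: sentence encodings (e.g., final GRU states per sentence) are scored against a question encoding, normalized with a softmax, and combined into a context vector. The dot-product scoring and all names here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def sentence_attention(sentence_encodings, question_vec):
    """Question-guided sentence-level attention (illustrative sketch).

    sentence_encodings: (num_sentences, hidden_dim) array, e.g. one GRU
        encoding per statement.
    question_vec: (hidden_dim,) GRU encoding of the question.
    Returns (weights, context): a distribution over sentences and the
    attention-weighted context representation.
    """
    scores = sentence_encodings @ question_vec   # relevance of each sentence to the question
    weights = softmax(scores)                    # normalize scores into attention weights
    context = weights @ sentence_encodings       # weighted sum of sentence encodings
    return weights, context

# Toy usage with random encodings standing in for GRU outputs.
rng = np.random.default_rng(0)
sents = rng.normal(size=(4, 8))   # 4 sentences, hidden dim 8
q = rng.normal(size=8)
weights, context = sentence_attention(sents, q)
print(weights.sum())  # attention weights sum to 1
```

In the full model, user feedback would be encoded and used to update these sentence-level weights before the answer is decoded; this sketch shows only the forward attention step.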
Keywords: Deep learning, Natural language processing, Applications
Conflicts: uncc.edu, email.arizona.edu, nec-labs.com