Syntax-Aware Attention for Natural Language Inference with Phrase-Level Matching

Published: 2019, Last Modified: 17 May 2023 · CCL 2019
Abstract: Natural language inference (NLI) aims to predict whether a premise sentence entails a hypothesis sentence. Models based on tree structures have shown promising results on this task, but their performance still falls below that of sequential models. In this paper, we present a syntax-aware attention model for NLI that allows phrase-level matching between two sentences. We design a tree-structured semantic composition function that builds phrase representations according to syntactic trees. We then introduce cross-sentence attention to learn interaction information between the two sentences based on these phrase-level representations. Moreover, we explore a self-attention mechanism that enhances semantic representations by capturing context from the syntactic tree. Experimental results on the SNLI and SciTail datasets demonstrate that our model can model NLI more precisely and significantly improves performance.
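The two core ideas in the abstract can be sketched in a few lines. This is not the authors' code: the tanh composition and plain dot-product attention below are illustrative stand-ins (the paper's actual composition function and scoring are not specified here), and all function names are hypothetical.

```python
# Hedged sketch of (1) tree-structured phrase composition and
# (2) phrase-level cross-sentence attention. Toy stand-ins only.
import math

def compose(left, right):
    # Tree-structured composition: merge two child phrase vectors
    # into a parent phrase vector (a toy stand-in for a Tree-LSTM cell).
    return [math.tanh(a + b) for a, b in zip(left, right)]

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def cross_attention(premise_phrases, hypothesis_phrases):
    # For each premise phrase, score every hypothesis phrase
    # (dot product), normalize with softmax, and return the
    # attention-weighted mixture as its aligned representation.
    aligned = []
    for p in premise_phrases:
        scores = [sum(a * b for a, b in zip(p, h)) for h in hypothesis_phrases]
        weights = softmax(scores)
        mix = [sum(w * h[i] for w, h in zip(weights, hypothesis_phrases))
               for i in range(len(p))]
        aligned.append(mix)
    return aligned
```

For example, composing two leaf vectors and attending from two premise phrases over two hypothesis phrases yields one aligned vector per premise phrase, each the same dimensionality as its query.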