Entailment Graph Learning with Textual Entailment and Soft Transitivity

Published: 28 Apr 2022, Last Modified: 22 Oct 2023 · DLG4NLP 2022 Poster
Keywords: entailment graph, graph construction, inference, natural language processing
TL;DR: The paper proposes a new entailment graph learning method using a recognizing-textual-entailment (RTE) language model and soft transitivity constraints.
Abstract: Typed entailment graphs aim to learn entailment relations between predicates from text and model them as edges between predicate nodes. The construction of entailment graphs usually suffers from severe sparsity and from the unreliability of distributional similarity. We propose a two-stage method, Entailment Graph with Textual Entailment and Transitivity (EGT2). EGT2 learns local entailment relations by recognizing textual entailment between template sentences formed from typed CCG-parsed predicates. Based on the generated local graph, EGT2 then applies three novel soft transitivity constraints to capture the logical transitivity of entailment structures. Experiments on benchmark datasets show that EGT2 effectively models transitivity in entailment graphs to alleviate sparsity, and leads to significant improvement over current state-of-the-art methods. The full paper can be found at https://arxiv.org/abs/2204.03286.
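The soft transitivity idea can be sketched in a few lines. The following is a minimal illustration only, not the paper's three constraints: it uses a generic Łukasiewicz-style hinge relaxation, and the predicate names and score values are made-up assumptions.

```python
# Illustrative sketch (NOT EGT2's exact formulation): a generic soft
# transitivity penalty over local entailment scores. The intuition is
# that if a->b and b->c both score highly, then a->c should too; a
# hinge term penalizes triples that violate this.

from itertools import permutations

def transitivity_penalty(scores):
    """scores: dict mapping (premise, hypothesis) predicate pairs to
    local entailment probabilities in [0, 1]. Returns the summed
    hinge penalty max(0, s(a,b) + s(b,c) - 1 - s(a,c)) over all
    ordered predicate triples (a Lukasiewicz-style relaxation)."""
    preds = {p for pair in scores for p in pair}
    total = 0.0
    for a, b, c in permutations(preds, 3):
        s_ab = scores.get((a, b), 0.0)
        s_bc = scores.get((b, c), 0.0)
        s_ac = scores.get((a, c), 0.0)
        total += max(0.0, s_ab + s_bc - 1.0 - s_ac)
    return total

# Hypothetical example: confident buy->own and own->have edges with a
# weak buy->have edge yield a nonzero penalty for that triple.
scores = {("buy", "own"): 0.9, ("own", "have"): 0.9, ("buy", "have"): 0.1}
penalty = transitivity_penalty(scores)  # 0.9 + 0.9 - 1.0 - 0.1 = 0.7
```

In a learning setting, a penalty like this would be added to the training objective so that the global graph is nudged toward transitive closure without imposing it as a hard constraint.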
Community Implementations: [3 code implementations](https://www.catalyzex.com/paper/arxiv:2204.03286/code)