Unsupervised Learning of Entailment-Vector Word Embeddings

15 Feb 2018 (modified: 15 Feb 2018) · ICLR 2018 Conference Blind Submission · Readers: Everyone
Abstract: Entailment vectors are a principled way to encode in a vector which information is known and which is unknown. They are designed to model entailment, the relation in which one vector includes all the information in another. This paper investigates the unsupervised learning of entailment vectors for the semantics of words. Using simple entailment-based models of the semantics of words in text (distributional semantics), we induce entailment-vector word embeddings that outperform the best previous results at predicting entailment between words, in unsupervised and semi-supervised experiments on hyponymy.
TL;DR: We train word embeddings based on entailment instead of similarity, successfully predicting lexical entailment.
Keywords: word embeddings, natural language semantics, entailment, unsupervised learning, distributional semantics
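
To illustrate the core idea (this is a rough sketch, not the paper's actual entailment operator), one can read each dimension of an entailment vector as the log-odds that a semantic feature is known, and score entailment as soft inclusion: every feature known in the entailed vector should also be known in the entailing vector. The entailment_score function and the toy "dog"/"animal" vectors below are hypothetical, purely for illustration.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def entailment_score(premise, hypothesis):
    # Read each vector dimension as the log-odds that a feature is known.
    p_prem = sigmoid(premise)
    p_hyp = sigmoid(hypothesis)
    # Per dimension: probability that we do NOT have "the hypothesis knows
    # a feature that the premise does not" -- a soft inclusion test.
    per_dim = 1.0 - p_hyp * (1.0 - p_prem)
    # Log-score, assuming independence across dimensions.
    return float(np.sum(np.log(per_dim)))

# Hypothetical toy vectors: "dog" is more specific (more features known)
# than "animal", so dog => animal should score higher than the reverse.
rng = np.random.default_rng(0)
animal = rng.normal(size=50) - 1.0          # fewer features known
dog = animal + np.abs(rng.normal(size=50))  # strictly more information

print(entailment_score(dog, animal))  # hyponym entails hypernym: higher
print(entailment_score(animal, dog))  # reverse direction: lower

The asymmetry is the point: on hyponymy, the more specific word should entail the more general one, which is the directional relation that similarity-based embeddings cannot capture and that these embeddings are trained to encode.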