Understanding Composition of Word Embeddings via Tensor Decomposition

Published: 21 Dec 2018, Last Modified: 21 Apr 2024
ICLR 2019 Conference Blind Submission
Readers: Everyone
Abstract: Word embeddings are a powerful tool in natural language processing. In this paper we consider the problem of word embedding composition: given vector representations of two words, compute a vector for the entire phrase. We give a generative model that can capture specific syntactic relations between words. Under our model, we prove that the correlations between three words (measured by their pointwise mutual information, PMI) form a tensor that has an approximate low-rank Tucker decomposition. The result of the Tucker decomposition gives the word embeddings as well as a core tensor, which can be used to produce better compositions of the word embeddings. We complement our theoretical results with experiments that verify our assumptions and demonstrate the effectiveness of the new composition method.
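For intuition, here is a minimal NumPy sketch (not the authors' code; see the repository linked below for the reference implementation) of the kind of composition such a core tensor enables: the phrase vector is taken to be the sum of the two word vectors plus a bilinear correction obtained by contracting the core tensor with both embeddings. The function name `compose`, the shapes, and the random toy data are illustrative assumptions.

```python
import numpy as np

def compose(v_a, v_b, T):
    """Compose two word embeddings into a phrase vector (sketch).

    Assumes v_a, v_b in R^d and a core tensor T in R^{d x d x d},
    as would be produced by a Tucker decomposition of the PMI tensor.
    The phrase embedding is the sum of the word vectors plus the
    bilinear correction T(v_a, v_b, .).
    """
    correction = np.einsum('ijk,i,j->k', T, v_a, v_b)
    return v_a + v_b + correction

# Toy usage with random data; in practice the embeddings and core
# tensor would be estimated from corpus co-occurrence statistics.
d = 50
rng = np.random.default_rng(0)
v_a, v_b = rng.normal(size=d), rng.normal(size=d)
T = rng.normal(size=(d, d, d)) / d  # scaled to keep the correction modest
phrase_vec = compose(v_a, v_b, T)
```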
Keywords: word embeddings, semantic composition, tensor decomposition
TL;DR: We present a generative model for compositional word embeddings that captures syntactic relations, and provide empirical verification and evaluation.
Code: [abefrandsen/syntactic-rand-walk](https://github.com/abefrandsen/syntactic-rand-walk)
Community Implementations: [1 code implementation on CatalyzeX](https://www.catalyzex.com/paper/arxiv:1902.00613/code)
