A Differentiable Self-disambiguated Sense Embedding Model via Scaled Gumbel Softmax

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Blind Submission
Abstract: We present a differentiable multi-prototype word representation model that disentangles senses of polysemous words and produces meaningful sense-specific embeddings without external resources. It jointly learns how to disambiguate senses given local context and how to represent senses using hard attention. Unlike previous multi-prototype models, our model approximates discrete sense selection in a differentiable manner via a modified Gumbel softmax. We also propose a novel human evaluation task that quantitatively measures (1) how meaningful the learned sense groups are to humans and (2) how well the model is able to disambiguate senses given a context sentence. Our model outperforms competing approaches on both human evaluations and multiple word similarity tasks.
Keywords: unsupervised representation learning, sense embedding, word sense disambiguation, human evaluation
TL;DR: Disambiguate and embed word senses with a differentiable hard-attention model using Scaled Gumbel Softmax
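To make the core mechanism concrete: the model approximates discrete sense selection with a Gumbel-softmax relaxation, where Gumbel noise perturbs the sense logits before a temperature-controlled softmax. The sketch below is a minimal NumPy illustration of this general technique, not the paper's exact formulation; the `scale` factor on the noise is a hypothetical reading of the "scaled" modification, and the parameter values are illustrative.

```python
import numpy as np

def scaled_gumbel_softmax(logits, temperature=0.5, scale=0.1, rng=None):
    """Differentiable relaxation of discrete sense selection.

    Standard Gumbel softmax adds Gumbel(0, 1) noise to the logits and
    applies a temperature-controlled softmax; here the noise is
    additionally multiplied by `scale` (a hypothetical rendering of the
    paper's "scaled" variant).
    """
    rng = rng or np.random.default_rng()
    # Sample Gumbel(0, 1) noise via the inverse-CDF trick.
    u = rng.uniform(1e-10, 1.0, size=np.shape(logits))
    gumbel = -np.log(-np.log(u))
    y = (np.asarray(logits, dtype=float) + scale * gumbel) / temperature
    # Numerically stable softmax.
    y = y - y.max(axis=-1, keepdims=True)
    e = np.exp(y)
    return e / e.sum(axis=-1, keepdims=True)

# Example: near-one-hot attention over 3 candidate senses of a word.
weights = scaled_gumbel_softmax(np.array([2.0, 0.5, -1.0]))
```

Lowering `temperature` pushes the output toward a one-hot vector (hard attention over senses), while the noise keeps selection stochastic and the softmax keeps it differentiable for end-to-end training.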