Infinite Dimensional Word Embeddings

ICLR 2017 workshop submission
Abstract: We describe a method for learning word embeddings with data-dependent dimensionality. Our Infinite Skip-Gram (iSG) and Infinite Continuous Bag-of-Words (iCBOW) models are nonparametric analogs of Mikolov et al.'s (2013) well-known 'word2vec' models. Vectors are made infinite-dimensional by employing techniques used by Côté & Larochelle (2016) to define an RBM with an infinite number of hidden units. We show qualitatively and quantitatively that the iSG and iCBOW are competitive with their fixed-dimension counterparts while having the ability to infer the appropriate capacity of each word representation.
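To make the stochastic-dimensionality idea concrete, here is a minimal NumPy sketch, not the authors' implementation: a skip-gram-style energy is computed over only the first z dimensions of each word/context vector, and a per-dimension penalty keeps the distribution over z well defined. The names (z_max, penalty), the exact energy form, and the truncation of the infinite sum at z_max are all assumptions made for illustration.

```python
import numpy as np

# Hypothetical sketch of a skip-gram-style model with stochastic
# dimensionality; parameter names and energy form are assumptions,
# and the infinite sum over z is truncated at z_max for illustration.

rng = np.random.default_rng(0)
vocab_size, z_max, penalty = 1000, 50, 0.1

W = rng.normal(0, 0.1, (vocab_size, z_max))  # "input" word vectors
C = rng.normal(0, 0.1, (vocab_size, z_max))  # context vectors

def energy(w, c, z):
    """Energy of a (word, context) pair using only the first z
    dimensions, plus a per-dimension penalty that makes the
    normalizer over z finite."""
    return -(W[w, :z] @ C[c, :z]) + penalty * z

def p_z_given_pair(w, c):
    """Distribution over the number of active dimensions for one pair."""
    logits = -np.array([energy(w, c, z) for z in range(1, z_max + 1)])
    unnorm = np.exp(logits - logits.max())  # numerically stable softmax
    return unnorm / unnorm.sum()

# Expected dimensionality for a pair: different words can end up
# "using" different amounts of representational capacity.
pz = p_z_given_pair(3, 17)
print("E[z] =", (np.arange(1, z_max + 1) * pz).sum())
```

Under this reading, inferring E[z] per word is what gives each representation its own effective capacity, which is the property the abstract claims for the iSG and iCBOW.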
TL;DR: Word embeddings with stochastic dimensionality.
Keywords: Natural language processing, Unsupervised Learning
Conflicts: uci.edu, princeton.edu