Infinite Dimensional Word Embeddings

Eric Nalisnick, Sachin Ravi

ICLR 2017 workshop submission (modified: Feb 15, 2017)
  • Abstract: We describe a method for learning word embeddings with data-dependent dimensionality. Our Infinite Skip-Gram (iSG) and Infinite Continuous Bag-of-Words (iCBOW) are nonparametric analogs of Mikolov et al.'s (2013) well-known 'word2vec' models. Vectors are made infinite dimensional by employing techniques used by Côté & Larochelle (2016) to define an RBM with an infinite number of hidden units. We show qualitatively and quantitatively that the iSG and iCBOW are competitive with their fixed-dimension counterparts while having the ability to infer the appropriate capacity of each word representation.
  • TL;DR: Word embeddings with stochastic dimensionality.
  • Keywords: Natural language processing, Unsupervised Learning
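To make the abstract's core idea concrete, the sketch below illustrates (in a drastically simplified form, not the authors' implementation) how a per-dimension penalty that grows with the dimension index can induce a distribution over embedding dimensionality for a word–context pair, so that each pair can "use" a different number of dimensions. All names (`W_in`, `W_out`, the penalty vector `a`) and the truncation `D_max` are assumptions for illustration; the actual iSG/iCBOW models define this over an unbounded number of dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)

V, D_max = 5, 8  # tiny vocabulary; D_max truncates the "infinite" dimension
W_in = rng.normal(0, 0.1, (V, D_max))   # input ("center" word) vectors -- hypothetical names
W_out = rng.normal(0, 0.1, (V, D_max))  # output ("context" word) vectors
a = np.linspace(0.1, 2.0, D_max)        # per-dimension penalty that grows with the
                                        # dimension index, keeping the sum over z finite

def joint_scores(w, c):
    """Unnormalized score s(c, z | w) for each dimensionality z = 1..D_max.

    Using the first z dimensions scores the truncated inner product minus
    the cumulative penalty; prefix sums give one score per choice of z.
    """
    per_dim = W_in[w] * W_out[c] - a  # contribution of each individual dimension
    return np.cumsum(per_dim)         # score of stopping at z = 1, 2, ..., D_max

def p_z_given_pair(w, c):
    """Softmax over the per-z scores: a posterior over dimensionality z."""
    s = joint_scores(w, c)
    e = np.exp(s - s.max())           # subtract max for numerical stability
    return e / e.sum()

p = p_z_given_pair(0, 1)
print("p(z | w, c):", np.round(p, 3))
print("expected dimensionality:", float(np.sum((np.arange(D_max) + 1) * p)))
```

Because the penalty accumulates, high-index dimensions are only worth "paying for" when their inner-product contribution is large, which is the intuition behind inferring a different effective capacity for each word representation.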