The Expressive Power of Word Embeddings

Yanqing Chen, Bryan Perozzi, Rami Al-Rfou, Steven Skiena

Jan 16, 2013 (modified: Jan 16, 2013). ICLR 2013 conference submission.
  • Decision: reject
  • Abstract: We seek to better understand the differences in quality among several publicly released word embeddings. We propose several tasks that help distinguish the characteristics of different embeddings. Our evaluation shows that embeddings are able to capture deep semantics even in the absence of sentence structure. Moreover, benchmarking the embeddings reveals great variance in the quality and characteristics of the semantics they capture. Finally, we show the impact of varying the number of dimensions, and the resolution of each dimension, on the effective useful features captured by the embedding space. Our contributions highlight the importance of embeddings for NLP tasks and the effect of their quality on final results.