Offline bilingual word vectors, orthogonal transformations and the inverted softmax

Samuel L. Smith, David H. P. Turban, Steven Hamblin, Nils Y. Hammerla

Nov 04, 2016 (modified: Feb 13, 2017) ICLR 2017 conference submission readers: everyone
  • Abstract: Usually bilingual word vectors are trained "online". Mikolov et al. showed they can also be found "offline", whereby two pre-trained embeddings are aligned with a linear transformation, using dictionaries compiled from expert knowledge. In this work, we prove that the linear transformation between the two spaces should be orthogonal. This transformation can be obtained using the singular value decomposition. We introduce a novel "inverted softmax" for identifying translation pairs, with which we improve the precision @1 of Mikolov's original mapping from 34% to 43%, when translating a test set composed of both common and rare English words into Italian. Orthogonal transformations are more robust to noise, enabling us to learn the transformation without expert bilingual signal by constructing a "pseudo-dictionary" from the identical character strings which appear in both languages, achieving 40% precision on the same test set. Finally, we extend our method to retrieve the true translations of English sentences from a corpus of 200k Italian sentences with a precision @1 of 68%.
  • TL;DR: We show that a linear transformation between word vector spaces should be orthogonal and can be obtained analytically using the SVD, and introduce the inverted softmax for information retrieval.
  • Keywords: Natural language processing, Transfer Learning, Applications
  • Conflicts: babylonhealth.com, cam.ac.uk
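
The two ideas in the abstract, an orthogonal map fitted analytically with the SVD (the orthogonal Procrustes solution) and an inverted softmax that normalises similarity scores over source words rather than target words, can be sketched in a few lines of NumPy. This is a toy illustration with random vectors standing in for pre-trained embeddings; the dictionary size, dimensionality, and function names are illustrative, not taken from the paper's code.

```python
import numpy as np

# Toy stand-ins for pre-trained embeddings of a small training dictionary.
# Row i of X (source language) is paired with row i of Y (target language).
rng = np.random.default_rng(0)
n, d = 6, 4                      # illustrative sizes, not from the paper
X = rng.normal(size=(n, d))
Y = rng.normal(size=(n, d))

# Orthogonal Procrustes: the orthogonal W minimising ||X W^T - Y||_F is
# W = U V^T, where U S V^T is the SVD of Y^T X.
U, _, Vt = np.linalg.svd(Y.T @ X)
W = U @ Vt                        # orthogonal by construction: W W^T = I

def inverted_softmax(sim, beta=1.0):
    """Turn a similarity matrix into translation probabilities.

    sim[i, j] scores source word i against target word j. A plain softmax
    normalises each row (over target words); the inverted softmax instead
    normalises each column (over source words), which penalises target
    "hub" words that are close to everything.
    """
    e = np.exp(beta * sim)
    return e / e.sum(axis=0, keepdims=True)

sim = (X @ W.T) @ Y.T             # map source vectors, score against targets
P = inverted_softmax(sim)         # each column of P sums to 1
```

To translate source word i one would then pick `np.argmax(P[i])`; the point of the column-wise normalisation is that a hub target word shares its probability mass across many source words, so it rarely wins.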