Normalized Word Embedding and Orthogonal Transform for Bilingual Word Translation

HLT-NAACL 2015 (modified: 04 Sept 2019)
Abstract: Word embedding has been found to be highly effective for translating words from one language to another by a simple linear transform. However, we found some inconsistencies among the objective functions of the embedding and the transform learning, as well as the distance measurement. This paper proposes a solution that normalizes the word vectors on a hypersphere and constrains the linear transform to be an orthogonal transform. The experimental results confirm that the proposed solution offers better performance on a word similarity task and an English-to-Spanish word translation task.
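The two constraints described in the abstract, unit-length word vectors and an orthogonal linear transform, can be illustrated with a minimal sketch. The paper derives its own training procedure for the constrained transform; the closed-form orthogonal Procrustes solution used below (via SVD) is an assumption chosen only because it produces a transform satisfying the same orthogonality constraint. The matrices `X` and `Y`, the function names, and the toy data are hypothetical.

```python
import numpy as np

def normalize_rows(M):
    # Project each word vector onto the unit hypersphere (length normalization).
    return M / np.linalg.norm(M, axis=1, keepdims=True)

def learn_orthogonal_map(X, Y):
    # Orthogonal Procrustes: find W minimizing ||X W - Y||_F subject to W^T W = I.
    # X, Y: (n_pairs, dim) source/target vectors aligned by a seed dictionary.
    U, _, Vt = np.linalg.svd(X.T @ Y)
    return U @ Vt

def translate(source_vec, W, target_matrix):
    # With unit-length vectors and an orthogonal W, the mapped vector stays on
    # the hypersphere, so cosine similarity reduces to a plain dot product.
    scores = target_matrix @ (source_vec @ W)
    return int(np.argmax(scores))

# Toy usage with random "embeddings" standing in for trained ones (hypothetical).
rng = np.random.default_rng(0)
X = normalize_rows(rng.normal(size=(1000, 50)))   # source-language vectors
Y = normalize_rows(rng.normal(size=(1000, 50)))   # target-language vectors
W = learn_orthogonal_map(X[:500], Y[:500])        # seed dictionary pairs
print(translate(X[600], W, Y))                    # index of the predicted target word
```

Because W is orthogonal, it preserves vector norms and pairwise angles, so translation retrieval and monolingual similarity are measured in a consistent way, which is the inconsistency the abstract points to.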