Adapting Attention-Based Neural Network to Low-Resource Mongolian-Chinese Machine Translation

Published: 01 Jan 2016 · NLPCC/ICCPOL 2016 · CC BY-SA 4.0
Abstract: Neural machine translation (NMT) has shown very promising results for resource-rich language pairs such as English-French and English-German. This success relies in part on the availability of large-scale, high-quality parallel corpora. We investigate how to adapt NMT to very low-resource Mongolian-Chinese machine translation by introducing an attention mechanism, sub-word translation, monolingual data, and an NMT correction model. We propose a sub-words model to address the out-of-vocabulary (OOV) problem in the attention-based NMT model, while monolingual data help alleviate the low-resource problem. In addition, we explore a Chinese NMT correction model to further enhance translation performance. Experiments show that the adapted attention-based Mongolian-Chinese NMT system obtains an improvement of 1.70 BLEU points over the phrase-based statistical machine translation baseline and 3.86 BLEU points over the standard NMT baseline on an open training set.
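The abstract does not detail how the sub-words model segments rare words, so the following is only a minimal illustrative sketch of one common approach to the OOV problem, byte-pair-encoding-style segmentation; the function names, the toy corpus, and the merge count are all hypothetical and are not taken from the paper.

```python
from collections import Counter

def learn_bpe_merges(word_freqs, num_merges):
    """Learn BPE-style merges from a {word: frequency} dictionary."""
    # Each word starts as a tuple of characters; merges fuse frequent pairs.
    vocab = Counter({tuple(w): f for w, f in word_freqs.items()})
    merges = []
    for _ in range(num_merges):
        pairs = Counter()
        for symbols, freq in vocab.items():
            for a, b in zip(symbols, symbols[1:]):
                pairs[(a, b)] += freq
        if not pairs:
            break
        best = max(pairs, key=pairs.get)
        merges.append(best)
        # Rebuild the vocabulary with the chosen pair merged everywhere.
        new_vocab = Counter()
        for symbols, freq in vocab.items():
            new_vocab[tuple(_merge(list(symbols), best))] += freq
        vocab = new_vocab
    return merges

def _merge(symbols, pair):
    """Replace every adjacent occurrence of `pair` with its concatenation."""
    out, i = [], 0
    while i < len(symbols):
        if i + 1 < len(symbols) and (symbols[i], symbols[i + 1]) == pair:
            out.append(symbols[i] + symbols[i + 1])
            i += 2
        else:
            out.append(symbols[i])
            i += 1
    return out

def segment(word, merges):
    """Split a possibly-OOV word into sub-word units via learned merges."""
    symbols = list(word)
    for pair in merges:
        symbols = _merge(symbols, pair)
    return symbols

if __name__ == "__main__":
    corpus = {"lower": 5, "newest": 6, "widest": 3, "low": 7}  # toy data
    merges = learn_bpe_merges(corpus, num_merges=10)
    # An unseen word is decomposed into known sub-word units rather than
    # mapped to a single OOV token.
    print(segment("lowest", merges))
```

The point of such segmentation is that an open vocabulary of surface forms is covered by a small, closed vocabulary of sub-word units, which is what allows an attention-based NMT model to translate words never seen during training.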