Abstract: Following the success of the Transformer architecture in neural machine translation, integrating linguistic features into such systems has gained considerable interest in both research and practice. We propose an abstract template integration model that strengthens the structural information of the source language, drawn from its syntactic tree, improving translation quality with only a small increase in computational cost. Moreover, previous work has not considered the effect of the template-generating mechanism, even though it is an essential component of template-based translation. In this work, we investigate various template-generating methods and propose two prominent abstract template generation techniques based on POS information. Combined with the strengths of the Transformer, our approach effectively incorporates and extracts linguistic features to enrich the information available during the encoding phase. Experiments on several benchmarks show that our approaches achieve competitive results against strong baselines while requiring less training time. Furthermore, our results indicate that syntactic information remains fertile ground for gains in neural machine translation. Our code is available at https://github.com/phuongnm-bkhn/multisources-trans-nmt .