Abstract: In recent years, neural machine translation (NMT) has achieved significant breakthroughs, driven by the widespread deployment of large language models (LLMs) and the prompting methods built on them. Although powerful, scaled-up models are complex and enormously expensive to train, which limits their practicality. To this end, we propose a simple yet effective approach named Automated Prompt NMT (AP-NMT), which comprises an automatic prompt construction module and uses the generated prompts to incorporate syntactic information into the training of a portable-sized Transformer-based translation model. In this way, our method enables the NMT model to acquire comprehensive knowledge of target-language patterns without expanding the model architecture. Extensive experiments on low-resource IWSLT datasets demonstrate the effectiveness of AP-NMT in improving translation accuracy. We further provide a comprehensive analysis of the factors that influence our method and the benefits it brings.