ICLFP-NMT: Neural Machine Translation for ICL Flexible Prompt

Published: 01 Jan 2024 · Last Modified: 05 Jun 2025 · ICIC (LNAI 3) 2024 · CC BY-SA 4.0
Abstract: Common prompt-based neural machine translation (Prompt-NMT) methods for LLMs rely on discrete prompt words and fixed template styles, which hinder both LLM fine-tuning and contextual feature extraction. In addition, the selection of prompt instances is a major factor affecting Prompt-NMT performance. We therefore propose a flexible prompt method. Specifically, we construct a dual-encoder soft prototype that combines spatial clustering with maximum-margin constraints to generate prompt instances. We also present a virtual template generation method that uses a pseudo-prompt encoder to adapt to the current translation episode and build a virtual prompt template; this alleviates the instance-selection problem in in-context learning (ICL) and mitigates the rigidity of fixed template styles. On the CCMT translation task, the BLEU scores of our model improve significantly over the baseline system, verifying the effectiveness of the proposed method.
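To make the two ideas in the abstract concrete, below is a minimal sketch of (a) prompt-instance selection via spatial clustering and (b) a pseudo-prompt encoder that produces a "virtual" (continuous) prompt template. All names here (`select_prompt_instances`, `PseudoPromptEncoder`) and the specific layer choices are illustrative assumptions, not the authors' implementation; the paper's maximum-margin constraint is noted but omitted for brevity.

```python
# Illustrative sketch only: assumed interfaces, not the paper's code.
import torch
import torch.nn as nn
from sklearn.cluster import KMeans


def select_prompt_instances(pool_embs: torch.Tensor, query_emb: torch.Tensor,
                            n_clusters: int = 8, k: int = 4) -> list[int]:
    """Pick in-context examples by spatial clustering: cluster the candidate
    pool, take each cluster's member closest to the query, then keep the k
    nearest representatives. (The paper additionally applies a maximum-margin
    constraint, omitted in this sketch.)"""
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(pool_embs.numpy())
    labels = torch.as_tensor(km.labels_)
    dists = torch.cdist(query_emb[None], pool_embs)[0]  # distance to query
    picked = []
    for c in range(n_clusters):
        members = (labels == c).nonzero(as_tuple=True)[0]
        picked.append(members[dists[members].argmin()].item())
    picked.sort(key=lambda i: dists[i].item())
    return picked[:k]


class PseudoPromptEncoder(nn.Module):
    """Builds a virtual prompt template: learnable pseudo tokens are
    re-encoded by a small LSTM + MLP, so the prompt is a trainable
    continuous sequence rather than a fixed discrete template."""

    def __init__(self, n_virtual: int, d_model: int):
        super().__init__()
        self.pseudo = nn.Embedding(n_virtual, d_model)
        self.lstm = nn.LSTM(d_model, d_model // 2, bidirectional=True,
                            batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(d_model, d_model), nn.ReLU(),
                                 nn.Linear(d_model, d_model))

    def forward(self, batch_size: int) -> torch.Tensor:
        # (batch, n_virtual, d_model), to be prepended to encoder inputs
        x = self.pseudo.weight[None].expand(batch_size, -1, -1).contiguous()
        h, _ = self.lstm(x)
        return self.mlp(h)


if __name__ == "__main__":
    d = 32
    pool = torch.randn(100, d)             # candidate example embeddings
    query = torch.randn(d)                 # current source-sentence embedding
    print("chosen instance ids:", select_prompt_instances(pool, query))

    enc = PseudoPromptEncoder(n_virtual=10, d_model=d)
    print("virtual prompt shape:", tuple(enc(batch_size=2).shape))
```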