Modeling Future for Neural Machine Translation by Fusing Target Information

Anonymous

17 Sept 2021 (modified: 05 May 2023) · ACL ARR 2021 September Blind Submission
Abstract: Sequence-to-sequence Neural Machine Translation (NMT) models have achieved excellent performance. However, the NMT decoder makes predictions based only on the source and the target historical context, ignoring target future information completely; as a result, the model does not consider potential future information when making decisions. To alleviate this problem, we propose a simple and effective {\bf Fu}ture-fused {\bf NMT} model called \textsc{FuNMT}, which introduces a reverse decoder to explicitly model target future information, and then adopts an agreement mechanism that enables the forward decoder to learn this future information. Empirical studies on multiple benchmarks show that our proposed model significantly improves translation quality.
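The abstract does not specify the exact form of the agreement mechanism. As a hedged illustration only, one common way to couple a forward and a reverse decoder is to add a penalty on the distance between their per-position hidden states to the two decoders' own losses; the sketch below uses a mean squared distance, and the function name `funmt_loss` and the weight `lam` are illustrative assumptions, not names from the paper.

```python
# Hedged sketch of an agreement-style training objective (assumption:
# the agreement term penalizes disagreement between forward and reverse
# decoder hidden states via a mean squared distance; the paper does not
# state this exact form).

def mse(a, b):
    """Mean squared distance between two equal-length vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def funmt_loss(loss_fwd, loss_rev, h_fwd, h_rev, lam=0.5):
    """Total loss: both decoders' translation losses plus a weighted
    agreement penalty averaged over aligned decoding positions."""
    agreement = sum(mse(f, r) for f, r in zip(h_fwd, h_rev)) / len(h_fwd)
    return loss_fwd + loss_rev + lam * agreement

# Toy usage: three decoding steps with 2-dimensional "hidden states".
h_fwd = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
h_rev = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]]
total = funmt_loss(2.0, 2.1, h_fwd, h_rev)  # identical states: penalty is 0
```

In this toy case the forward and reverse states agree exactly, so the agreement penalty vanishes and the total reduces to the sum of the two decoder losses.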