Neighbors Are Not Strangers: Improving Non-Autoregressive Translation under Low-Frequency Lexical Constraints


08 Mar 2022 (modified: 05 May 2023) · NAACL 2022 Conference Blind Submission · Readers: Everyone
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Abstract: Lexically constrained neural machine translation (NMT) has drawn considerable industrial attention for its practical use in specialized domains. However, current autoregressive approaches suffer from high latency. In this paper, we address this problem with non-autoregressive translation (NAT), owing to its efficiency advantage. We find that current constrained NAT models, which are based on iterative editing, do not handle low-frequency constraints well. To this end, we propose a plug-in algorithm for this line of work, Aligned Constrained Training (ACT), which alleviates the problem by familiarizing the model with the source-side context of the constraints. Experiments on general- and specific-domain datasets show that our model improves over the backbone constrained NAT model in both constraint preservation and translation quality, especially for rare constraints.
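The abstract evaluates models on constraint preservation, i.e., whether the required target-side terms actually appear in the translation. A minimal illustrative sketch of such a metric (our own simplified implementation with assumed names, not the paper's code; real evaluations typically match at the token level after detokenization) might look like:

```python
def constraint_preservation_rate(hypotheses, constraints_per_sentence):
    """Fraction of lexical constraints that appear verbatim in the output.

    hypotheses: list of translated sentences (str).
    constraints_per_sentence: list of lists of constraint phrases (str),
        aligned one-to-one with `hypotheses`.
    """
    preserved = total = 0
    for hyp, constraints in zip(hypotheses, constraints_per_sentence):
        for phrase in constraints:
            total += 1
            # Simplified substring match; a stricter check would match
            # whole tokens to avoid partial-word hits.
            if phrase in hyp:
                preserved += 1
    return preserved / total if total else 1.0


# Hypothetical example: one constraint kept, one dropped.
rate = constraint_preservation_rate(
    ["the patient received acetaminophen"],
    [["acetaminophen", "ibuprofen"]],
)
print(rate)  # 0.5
```

This is the sentence-level aggregate; per-frequency breakdowns (as the paper reports for rare constraints) would bucket constraints by their training-corpus frequency before averaging.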
Copyright Consent Signature (type Name Or NA If Not Transferrable): Jiangjie Chen
Copyright Consent Name And Address: Fudan University. No. 2005 Songhu Road, Yangpu District, Shanghai, China.