Non-autoregressive Machine Translation by Modeling Syntactic Dependency Interrelation

Anonymous

08 Mar 2022 (modified: 05 May 2023) · NAACL 2022 Conference Desk Rejected Commitment Submission · Readers: Everyone
Paper Link: https://openreview.net/forum?id=ZsUmF2Krbt
Paper Type: Long paper (up to eight pages of content + unlimited references and appendices)
Track: Machine Translation
Abstract: The non-autoregressive Transformer (NAT) significantly improves translation efficiency through parallel decoding. However, NAT models' weak modeling of word inter-dependencies prevents them from settling on a consistent mode when learning under the one-to-many multi-modality phenomenon. In this paper, we propose Inter-NAT, which explicitly models target-side word inter-dependencies for NAT models. We derive the word inter-dependencies from the syntactic dependency tree, which provides explicit modification relationships between words. These dependencies can coordinate the translation of the target sentence and alleviate the multi-modality issue. Experimental results on the WMT14 and WMT16 tasks show that, with only one-pass decoding, Inter-NAT achieves comparable or better performance than strong iterative NAT baselines while maintaining competitive efficiency.
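The core idea of the abstract — decomposing a target sentence's syntactic dependency tree into explicit modification relations between words — can be sketched as follows. This is an illustrative assumption, not the paper's implementation: the sentence, the head-index encoding, and the function name are all hypothetical.

```python
# Minimal sketch (not the paper's code): decompose a dependency tree,
# given as a head-index array, into (dependent, head) word pairs --
# the kind of explicit modification relations the abstract describes.

def dependency_pairs(words, heads):
    """Return (dependent, head) word pairs from a head-index array.

    heads[i] is the index of word i's syntactic head, or -1 for the root.
    """
    pairs = []
    for i, h in enumerate(heads):
        if h >= 0:  # the root has no head, so it yields no pair
            pairs.append((words[i], words[h]))
    return pairs

# Toy example: "she reads books", with "reads" as the root verb.
words = ["she", "reads", "books"]
heads = [1, -1, 1]  # "she" and "books" both modify "reads"
print(dependency_pairs(words, heads))
# -> [('she', 'reads'), ('books', 'reads')]
```

In practice the head indices would come from a syntactic parser run on the target sentence; the resulting pairs give the model an explicit signal about which words constrain which during parallel decoding.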
Software: zip
Naacl Preprint: yes
Preprint: no
TL;DR: We decompose and model the syntactic dependency interrelations of the target sentence, improving translation quality.
Authorship: I confirm that I am one of the authors of this paper.
Paper Version: I confirm that this link is for the latest version of the paper in ARR that has reviews and a meta-review.
Anonymity Period: I confirm that this submission complies with the anonymity period.
Commitment Note: https://openreview.net/forum?id=H0MlSORHJbc
Author Profiles: I confirm that the OpenReview profiles of all authors are up-to-date (with current email address, institution name, institution domain).
Country Of Affiliation Of Corresponding Author: China