An Empirical Study of Unsupervised Neural Machine Translation: Analyzing NMT Output, Model Behavior, and Sentence Contributions

Anonymous

16 Oct 2023 · ACL ARR 2023 October Blind Submission · Readers: Everyone
Abstract: Unsupervised Neural Machine Translation (UNMT) focuses on improving NMT results under the assumption that no human-translated parallel data is available, yet little work has been done so far on highlighting its advantages over supervised methods or analyzing its output in aspects other than translation accuracy. We focus on three very diverse languages, French, Gujarati, and Kazakh, and train bilingual NMT models, to and from English, with various levels of supervision, in high- and low-resource setups. We measure the quality of the NMT output and compare the generated sequences' word order and semantic similarity to the source and reference sentences. We also use Layer-wise Relevance Propagation to analyze the models' behavior during training, and evaluate the source and target sentences' contribution to the NMT result, expanding the findings of previous works to the UNMT paradigm.
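The abstract does not specify which semantic-similarity metric the authors use, but a common baseline for comparing an NMT hypothesis against a source or reference sentence is cosine similarity over mean-pooled word embeddings. The sketch below illustrates that idea with a toy, hypothetical embedding table (`emb` and all vector values are made up for illustration, not taken from the paper):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def sentence_vector(tokens, embeddings):
    """Mean-pool word embeddings into a single sentence vector."""
    dims = len(next(iter(embeddings.values())))
    vec = [0.0] * dims
    n = 0
    for tok in tokens:
        if tok in embeddings:
            vec = [x + y for x, y in zip(vec, embeddings[tok])]
            n += 1
    return [x / n for x in vec] if n else vec

# Toy embeddings (hypothetical values, for illustration only).
emb = {"the": [0.1, 0.2], "cat": [0.9, 0.1], "dog": [0.8, 0.3]}
hyp = sentence_vector(["the", "cat"], emb)   # NMT hypothesis
ref = sentence_vector(["the", "dog"], emb)   # reference translation
score = cosine_similarity(hyp, ref)
```

In practice one would substitute contextual sentence embeddings (e.g. from a multilingual encoder) for the toy table, but the scoring step remains the same.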
Paper Type: long
Research Area: Machine Translation
Contribution Types: NLP engineering experiment
Languages Studied: French, Gujarati, Kazakh
Consent To Share Submission Details: On behalf of all authors, we agree to the terms above to share our submission details.