Extending Multilingual Machine Translation through Imitation Learning

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission · Readers: Everyone
TL;DR: Extending an MNMT model without severe catastrophic forgetting, using only parallel data between the new language and English
Abstract: Despite the growing variety of languages supported by existing multilingual neural machine translation (MNMT) models, most of the world's languages are still being left behind. We aim to extend large-scale MNMT models to a new language, allowing for translation between the newly added and all of the already supported languages in a challenging scenario: using only a parallel corpus between the new language and English. Previous approaches, such as continued training on parallel data including the new language, suffer from catastrophic forgetting (i.e., performance on the other languages is reduced). Our novel approach Imit-MNMT treats the task as an imitation learning process that mimics the behavior of an expert, a technique widely used in computer vision but not well explored in NLP. More specifically, we construct a pseudo multi-parallel corpus of the new and the original languages by pivoting through English, and imitate the output distribution of the original MNMT model. Extensive experiments show that our approach significantly improves translation performance between the new and the original languages without severe catastrophic forgetting. We also demonstrate that our approach is capable of mitigating the copy and off-target problems, two common issues in current large-scale MNMT models.
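The abstract describes two mechanisms: joining new-language/English and English/original-language sentence pairs on the shared English pivot to form a pseudo multi-parallel corpus, and training the extended model to imitate the output distribution of the original (expert) MNMT model. A minimal sketch of both ideas is below; all function names and the dictionary-join formulation are illustrative assumptions, not the paper's actual implementation, and the imitation term is sketched as a KL divergence between token distributions:

```python
import math

def pivot_pairs(new2en, en2orig):
    """Hypothetical sketch: join (new-language, English) pairs with
    (English, original-language) pairs on the shared English pivot
    sentence to build pseudo new->original training pairs."""
    en_index = {en: orig for en, orig in en2orig}
    return [(new, en_index[en]) for new, en in new2en if en in en_index]

def imitation_loss(student_probs, expert_probs, eps=1e-9):
    """KL(expert || student), averaged over target positions: the
    extended (student) model is pushed to reproduce the original
    (expert) MNMT model's output distribution.

    Both arguments are lists of per-position probability vectors
    over the shared vocabulary. `eps` guards against log(0)."""
    total = 0.0
    for p_exp, p_stu in zip(expert_probs, student_probs):
        total += sum(p * math.log((p + eps) / (q + eps))
                     for p, q in zip(p_exp, p_stu))
    return total / len(expert_probs)
```

Usage: `pivot_pairs([("x_new", "hello")], [("hello", "y_orig")])` yields the pseudo pair `[("x_new", "y_orig")]`, and the loss is zero when the student distribution matches the expert exactly, growing as the two diverge.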
Paper Type: long
Research Area: Machine Translation
Contribution Types: Model analysis & interpretability, Theory
Languages Studied: English, Kannada, Dinka, Bambara, Chokwe, Dyula, Balinese, Bemba, Banjar