An Empirical Study of Language Relatedness for Transfer Learning in Neural Machine Translation

08 Feb 2023, OpenReview Archive Direct Upload
Abstract: Neural Machine Translation (NMT) is known to outperform Phrase-Based Statistical Machine Translation (PBSMT) for resource-rich language pairs but not for resource-poor ones. Transfer Learning (Zoph et al., 2016) is a simple approach in which an NMT model for a resource-poor language pair (the child model) is initialized using a previously trained model for a resource-rich language pair (the parent model) with the same target language. This paper explores how different choices of parent models affect the performance of child models. We empirically show that choosing a parent model whose source language belongs to the same or a linguistically similar language family as the child model's source language yields the best results.
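The transfer-learning setup described above can be sketched as follows. This is a minimal, illustrative example, not the paper's implementation: the parameter names, the dictionary representation of model weights, and the specific language pairs are assumptions for illustration only.

```python
# Minimal sketch of transfer-learning initialization for NMT:
# the parent model's parameters warm-start the child model.
# Parameter names and the dict-based weight representation are
# hypothetical, not taken from the paper or any specific toolkit.

def transfer_init(parent_params, child_params):
    """Initialize child parameters from the parent where names match.

    parent_params / child_params: dicts mapping parameter names to
    weight values. Since parent and child share the same target
    language, target-side (decoder) parameters transfer directly;
    any child parameter absent from the parent keeps its own
    (e.g. random) initialization.
    """
    initialized = dict(child_params)
    for name, value in parent_params.items():
        if name in initialized:
            initialized[name] = value  # warm-start from the parent
    return initialized

# Hypothetical example: parent = French->English (resource-rich),
# child = Uzbek->English (resource-poor); same target language.
parent = {"encoder.embed": "fr-emb", "decoder.embed": "en-emb",
          "decoder.out": "en-out"}
child = {"encoder.embed": "uz-rand", "decoder.embed": "rand",
         "decoder.out": "rand"}
init = transfer_init(parent, child)
```

After this initialization, the child model is trained on the resource-poor pair's data as usual; the choice of parent source language is what the paper's experiments vary.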