Crossing the Threshold: Idiomatic Machine Translation through Retrieval Augmentation and Loss Weighting

Published: 07 Oct 2023, Last Modified: 01 Dec 2023, EMNLP 2023 Main
Submission Type: Regular Long Paper
Submission Track: Machine Translation
Keywords: machine translation, idioms, multi-word expressions, retrieval-based machine translation
TL;DR: Idioms are hard for neural machine translation systems to translate. We investigate when this is the case and propose simple methods to translate them correctly without degrading translation of non-idiomatic sentences.
Abstract: Idioms are common in everyday language but often pose a challenge to translators because their meanings do not follow from the meanings of their parts. Despite significant advances, machine translation systems still struggle to translate idiomatic expressions. We provide a simple characterization of idiomatic translation and related issues. This allows us to conduct a synthetic experiment revealing a tipping point at which transformer-based machine translation models correctly default to idiomatic translations. To expand multilingual resources, we compile a dataset of ~4k natural sentences containing idiomatic expressions in French, Finnish, and Japanese. To improve translation of natural idioms, we introduce two straightforward yet effective techniques: the strategic upweighting of training loss on potentially idiomatic sentences, and the use of retrieval-augmented models. This not only improves the accuracy of a strong pretrained MT model on idiomatic sentences by up to 13% (absolute), but also holds potential benefits for non-idiomatic sentences.
Submission Number: 4715
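The loss-upweighting technique mentioned in the abstract can be thought of as a per-sentence scaling of the standard cross-entropy objective. Below is a minimal PyTorch-style sketch of that idea; the function name, the `idiom_weight` value, and the `is_idiomatic` flag (assumed to come from some external idiom matcher) are illustrative assumptions, not the paper's exact implementation.

```python
# Minimal sketch: upweight the training loss of sentences flagged as idiomatic.
# Illustrative only -- weight value and idiom detection are hypothetical.
import torch
import torch.nn.functional as F


def weighted_nll(logits, targets, is_idiomatic, idiom_weight=2.0, pad_id=0):
    """Batch cross-entropy with extra weight on (potentially) idiomatic sentences.

    logits:       (batch, seq_len, vocab) decoder outputs
    targets:      (batch, seq_len) reference token ids
    is_idiomatic: (batch,) bool mask from an external idiom matcher (assumed)
    """
    # Per-token loss, ignoring padding positions.
    token_loss = F.cross_entropy(
        logits.transpose(1, 2), targets, ignore_index=pad_id, reduction="none"
    )  # shape: (batch, seq_len)
    mask = (targets != pad_id).float()
    sent_loss = (token_loss * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)

    # Scale up sentences that contain a potential idiom, then renormalize.
    weights = torch.where(
        is_idiomatic,
        torch.full_like(sent_loss, idiom_weight),
        torch.ones_like(sent_loss),
    )
    return (weights * sent_loss).sum() / weights.sum()
```

Such a weighting drops into an ordinary training loop in place of the default mean cross-entropy; how idiomatic sentences are identified and how large the weight should be are choices the paper itself evaluates.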