First the worst: Finding better gender translations during beam search

Anonymous

16 Nov 2021 (modified: 05 May 2023) · ACL ARR 2021 November Blind Submission
Abstract: Beam search generates machine translations by approximately seeking the most likely output under a model. However, beam search has been shown to amplify demographic biases exhibited by a model. We aim to address this, focusing on gender bias resulting from systematic errors in grammatical gender translation. Almost all prior work on this problem adjusts the training data or the model itself. By contrast, our approach changes only the inference procedure. We explore two techniques: applying constraints during inference to improve gender diversity in n-best lists, and reranking n-best lists using gender features obtained from the source sentence. Combining these methods gives large gains in gender translation accuracy for three language pairs without requiring additional bilingual data or retraining.
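The second technique, reranking an n-best list with source-side gender features, can be illustrated with a minimal sketch. The names below (`gender_agreement`, `rerank`, the word-list lexicon, and the additive score bonus) are illustrative assumptions, not the paper's actual feature extraction or scoring method, which the abstract does not specify.

```python
# Hypothetical sketch: rerank an n-best list toward hypotheses whose gendered
# words agree with gender features detected in the source sentence.
from dataclasses import dataclass


@dataclass
class Hypothesis:
    text: str
    model_score: float  # log-probability assigned by the NMT model


def gender_agreement(source_genders: list[str], hyp: Hypothesis,
                     gendered_lexicon: dict[str, str]) -> int:
    """Count target-side gendered words whose gender matches a source entity.

    `gendered_lexicon` maps target words to 'M'/'F'; a real system would
    likely use morphological analysis or a classifier rather than a word list.
    """
    matches = 0
    for token in hyp.text.lower().split():
        gender = gendered_lexicon.get(token)
        if gender is not None and gender in source_genders:
            matches += 1
    return matches


def rerank(nbest: list[Hypothesis], source_genders: list[str],
           gendered_lexicon: dict[str, str],
           weight: float = 1.0) -> list[Hypothesis]:
    """Re-sort an n-best list by model score plus a gender-agreement bonus."""
    return sorted(
        nbest,
        key=lambda h: h.model_score
        + weight * gender_agreement(source_genders, h, gendered_lexicon),
        reverse=True,
    )


# Toy example: the source sentence refers to a female doctor, so the feminine
# German translation should be preferred even with a slightly lower model score.
nbest = [
    Hypothesis("der arzt ist da", -0.9),    # masculine translation
    Hypothesis("die ärztin ist da", -1.1),  # feminine translation
]
lexicon = {"arzt": "M", "ärztin": "F"}
best = rerank(nbest, source_genders=["F"], gendered_lexicon=lexicon)[0]
print(best.text)  # -> "die ärztin ist da"
```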