EuroGEST: Investigating gender stereotypes in multilingual language models

ACL ARR 2025 May Submission 4620 Authors

20 May 2025 (modified: 03 Jul 2025), ACL ARR 2025 May Submission, CC BY 4.0
Abstract: Large language models increasingly support multiple languages, yet most benchmarks for gender bias remain English-centric. We introduce EuroGEST, a dataset designed to measure gender-stereotypical reasoning in LLMs across English and 29 European languages. EuroGEST builds on an existing expert-informed benchmark covering 16 gender stereotypes, expanded in this work using translation tools, quality estimation metrics, and morphological heuristics. Human evaluations confirm that our data generation method yields accurate translations and gender labels across languages. We use EuroGEST to evaluate 24 multilingual language models from six model families, demonstrating that the strongest stereotypes in all models across all languages are that women are 'beautiful', 'empathetic' and 'neat' and men are 'leaders', 'strong', 'tough' and 'professional'. We also show that larger models encode gendered stereotypes more strongly and that instruction fine-tuning does not consistently reduce them. Our work highlights the need for more multilingual studies of fairness in LLMs and offers scalable methods and resources to audit gender bias across languages.
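The data generation pipeline summarised in the abstract (machine translation of English stereotype sentences, filtering with a quality-estimation metric, and gender labelling via morphological heuristics) can be pictured with a minimal sketch. Everything below is an illustrative assumption rather than the authors' implementation: the function names, the 0.8 QE threshold, and the toy Czech suffix rule are not taken from the paper.

```python
"""Illustrative sketch of a EuroGEST-style data pipeline (not the authors' code)."""

from dataclasses import dataclass


@dataclass
class Candidate:
    source_en: str    # English stereotype sentence
    translation: str  # machine translation into the target language
    qe_score: float   # reference-free quality-estimation score, assumed in [0, 1]


def gender_from_czech_past_tense(sentence: str) -> str:
    """Toy morphological heuristic: Czech first-person past-tense verbs often end
    in -la (feminine) vs. -l (masculine). Real heuristics would be language-specific
    and validated against human judgements."""
    for token in sentence.lower().split():
        token = token.strip(".,!?")
        if token.endswith("la"):
            return "feminine"
        if token.endswith("l"):
            return "masculine"
    return "ambiguous"


def build_dataset(candidates: list[Candidate], qe_threshold: float = 0.8):
    """Keep only translations above the QE threshold and attach gender labels."""
    kept = []
    for c in candidates:
        if c.qe_score < qe_threshold:
            continue  # discard low-quality translations
        kept.append((c.translation, gender_from_czech_past_tense(c.translation)))
    return kept


if __name__ == "__main__":
    demo = [
        Candidate("I cried at the end of the film.", "Na konci filmu jsem plakala.", 0.93),
        Candidate("I cried at the end of the film.", "Na konci filmu jsem plakal.", 0.91),
        Candidate("I cried at the end of the film.", "Konec film plakat.", 0.41),
    ]
    print(build_dataset(demo))
```

Running the sketch keeps the two well-formed translations, labels them feminine and masculine respectively, and discards the low-scoring one; the real pipeline would apply language-specific rules and validated QE models per target language.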
Paper Type: Long
Research Area: Ethics, Bias, and Fairness
Research Area Keywords: model bias/fairness evaluation, multilingual benchmarks, multilingual evaluation, resources for less-resourced languages
Contribution Types: Model analysis & interpretability, Approaches to low-resource settings, Data resources
Languages Studied: Bulgarian, Croatian, Czech, Polish, Russian, Slovak, Slovenian, Ukrainian, Danish, Dutch, English, German, Norwegian, Swedish, Catalan, French, Galician, Italian, Portuguese, Romanian, Spanish, Latvian, Lithuanian, Estonian, Finnish, Hungarian, Greek, Irish, Maltese, Turkish.
Submission Number: 4620