Language Complexity in Multilingual Language Models

ACL ARR 2025 May Submission113 Authors

07 May 2025 (modified: 03 Jul 2025) · ACL ARR 2025 May Submission · CC BY 4.0
Abstract: Understanding the behavior of multilingual language models across different languages remains a significant challenge. Recent research has demonstrated that the Average Neuron-Wise Correlation (ANC) enables a comprehensive analysis of activation similarities in multilingual models. This study proposes the Average Wasserstein Distance (AWD) between activation value distributions as an alternative metric and compares it to ANC across three datasets: XNLI, ReadMe++, and Vikidia. By applying these metrics, we aim to elucidate the processes underlying large language models, thereby enhancing our understanding of cross-lingual transfer and model accuracy.
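The two metrics named in the abstract can be illustrated with a minimal sketch. Assuming activations are collected as a (samples × neurons) matrix per language, AWD averages the one-dimensional Wasserstein distance between each neuron's activation distributions across two languages, and ANC averages the per-neuron Pearson correlation (which requires aligned samples, e.g. parallel sentences). The function names and this exact per-neuron formulation are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from scipy.stats import wasserstein_distance


def average_wasserstein_distance(acts_a: np.ndarray, acts_b: np.ndarray) -> float:
    """Mean 1-D Wasserstein distance between per-neuron activation
    distributions for two languages. Inputs: (samples, neurons) arrays;
    samples need not be aligned, since only the distributions are compared."""
    return float(np.mean([
        wasserstein_distance(acts_a[:, i], acts_b[:, i])
        for i in range(acts_a.shape[1])
    ]))


def average_neuronwise_correlation(acts_a: np.ndarray, acts_b: np.ndarray) -> float:
    """Mean Pearson correlation between matched neurons. Unlike AWD,
    this requires row-aligned inputs (e.g. parallel sentences)."""
    return float(np.mean([
        np.corrcoef(acts_a[:, i], acts_b[:, i])[0, 1]
        for i in range(acts_a.shape[1])
    ]))
```

Note the design difference this sketch makes visible: AWD compares only the shape of each neuron's activation distribution, while ANC depends on sample-level alignment between the two languages.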
Paper Type: Long
Research Area: Multilingualism and Cross-Lingual NLP
Research Area Keywords: cross-lingual transfer, multilingual pre-training, multilingual benchmarks, multilingual evaluation, less-resourced languages
Contribution Types: Model analysis & interpretability
Languages Studied: English, Russian, French, Hindi, Arabic, Romanian, Greek, Spanish, Hebrew, Turkish, Belarusian, Ukrainian, Bulgarian, Chinese, Japanese, Irish, German, Italian, Welsh
Submission Number: 113