XQ-MEval: A Dataset with Cross-lingual Parallel Quality for Benchmarking Translation Metrics

ACL ARR 2026 January Submission 5128 Authors

05 Jan 2026 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: Multilingual Translation Evaluation, Automatic Evaluation Metrics, Cross-lingual Scoring Bias, Benchmark
Abstract: Automatic evaluation metrics are essential for building multilingual translation systems. The common practice for evaluating these systems is to average metric scores across languages, yet this practice is questionable, since metrics may suffer from cross-lingual scoring bias, where translations of equal quality receive different scores across languages. This problem has not been systematically studied because no benchmark provides parallel-quality instances across languages, and expert annotation at this scale is impractical. In this work, we propose XQ-MEval, a semi-automatically built dataset covering nine translation directions, to benchmark translation metrics. Specifically, we automatically inject MQM-defined errors into gold translations, have native speakers filter the injected errors for reliability, and merge errors to generate pseudo translations with controllable quality. These pseudo translations are then paired with their sources and references to form triplets used to assess translation metrics. Using XQ-MEval, our experiments on nine representative metrics reveal inconsistencies between score averaging and human judgment and provide the first empirical evidence of cross-lingual scoring bias. Finally, we propose a normalization strategy derived from XQ-MEval that aligns score distributions across languages, improving the fairness and reliability of multilingual metric evaluation.
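To make the normalization idea concrete: the goal is to align each language direction's metric-score distribution before cross-lingual averaging, so that equal-quality translations receive comparable scores. The abstract does not specify the exact procedure derived from XQ-MEval, so the following is a minimal sketch assuming simple per-language z-score normalization; the function name and toy data are illustrative only.

```python
# Minimal sketch of per-language score normalization (an assumption:
# the paper's actual XQ-MEval-derived procedure is not given here).
# Idea: align each direction's score distribution before averaging,
# so directions with systematically lower raw scores are not penalized.
from statistics import mean, stdev

def normalize_per_language(scores_by_lang):
    """Z-normalize metric scores within each language direction."""
    normalized = {}
    for lang, scores in scores_by_lang.items():
        mu, sigma = mean(scores), stdev(scores)
        normalized[lang] = [(s - mu) / sigma for s in scores]
    return normalized

# Toy example: raw metric scores on different scales per direction.
raw = {
    "en-de": [0.80, 0.85, 0.90],
    "en-lo": [0.40, 0.50, 0.60],  # systematically lower raw scores
}
norm = normalize_per_language(raw)
# Both directions now have zero mean and unit variance, so a
# cross-lingual average no longer reflects scale differences.
```

Under this sketch, a system's averaged score reflects its relative standing within each direction rather than the raw scale of each direction's scores.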
Paper Type: Long
Research Area: Multilinguality and Language Diversity
Research Area Keywords: multilingual benchmarks, multilingual evaluation, metrics
Contribution Types: NLP engineering experiment, Data resources
Languages Studied: Chinese, Japanese, Lao, Vietnamese, Indonesian, French, Spanish, Sinhala, German, English
Submission Number: 5128