Keywords: Leaderboards, LLM, Evaluation, Benchmarking
TL;DR: We introduce HUMAINE, a new evaluation framework for LLMs that uses demographically stratified sampling and multi-turn conversations to reveal significant performance differences across user demographics.
Abstract: The evaluation of large language models faces significant challenges. Technical benchmarks often lack real-world relevance, while existing human preference evaluations suffer from unrepresentative sampling, superficial assessment depth, and single-metric reductionism. To address these issues, we introduce HUMAINE, a framework for multidimensional, demographically aware measurement of human-AI interaction. We collected multi-turn, naturalistic conversations from 23,404 participants, stratified across 22 demographic groups in both the US and UK, to evaluate 28 state-of-the-art models across five human-centric dimensions. We fit a hierarchical Bayesian Bradley-Terry-Davidson (BTD) model with post-stratification to census data, and our analysis reveals three key insights. $\textbf{(1)}$ We establish a clear performance hierarchy in which $\texttt{google/gemini-2.5-pro}$ ranks first overall, with a 95.6\% posterior probability of being the top-ranked model. $\textbf{(2)}$ We uncover significant preference heterogeneity, with user age emerging as the primary demographic axis of disagreement; a model's perceived rank can shift substantially across age groups, exposing failures in generalisation that unrepresentative samples typically mask. $\textbf{(3)}$ We quantify the vast differences in discriminative power across evaluation dimensions: ambiguous qualities like Trust, Ethics, and Safety show a 65\% tie rate, in stark contrast to the decisive 10\% tie rate for Overall Winner. Our work emphasises the need for a more multidimensional, demographically aware perspective in LLM evaluation. We release our complete dataset, interactive leaderboard, and open-source framework.
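For concreteness, below is a minimal sketch of the Davidson (tie-aware) extension of Bradley-Terry that underlies the tie-rate comparison above. This is not the paper's implementation: the function name and parameters are illustrative, and the hierarchical priors and census post-stratification described in the abstract are omitted.

```python
import numpy as np

def btd_probs(lam_i: float, lam_j: float, log_nu: float):
    """Win/tie/loss probabilities for one pairwise model comparison
    under the Davidson extension of Bradley-Terry (illustrative only)."""
    s_i, s_j = np.exp(lam_i), np.exp(lam_j)    # latent model strengths
    tie = np.exp(log_nu) * np.sqrt(s_i * s_j)  # tie mass, scaled by nu
    z = s_i + s_j + tie                        # normalising constant
    return s_i / z, tie / z, s_j / z           # P(i wins), P(tie), P(j wins)

# For models of equal strength, a dimension with a 65% tie rate implies a
# much larger tie parameter nu than one with a 10% tie rate.
print(btd_probs(0.2, -0.1, np.log(0.5)))
```

In this formulation, per-dimension tie parameters let the model separate genuine indifference (high tie propensity, as for Trust, Ethics, and Safety) from decisive preference (low tie propensity, as for Overall Winner).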
Primary Area: datasets and benchmarks
Submission Number: 19192