Uncovering Biased Views and Stereotypes in LLM Personas

ACL ARR 2026 January Submission6548 Authors

05 Jan 2026 (modified: 20 Mar 2026) · ACL ARR 2026 January Submission · CC BY 4.0
Keywords: Emergent Persona, LLM Personas, Political Bias, Religious Bias, Bias, Stereotypes, Multilingual LLMs, LLM
Abstract: In a world where LLMs have become an integral part of our lives, the potential for harm from biases and stereotypes in these models is an ever-increasing concern. In the case of chatbots, end users observe LLM outputs as if they were conversing with an "LLM persona," which is defined not only by the model architecture and its parameters but also by its instruction and preceding user prompts. These shape the subjective opinions and biases adopted by such personas and thus carry a potential for harm. Our contributions are twofold. First, we provide a framework to assess and quantify the political and religious views, as well as the personality traits, of LLM personas based on questionnaires. We also provide functionality to automatically machine-translate personas and questionnaires into other languages. Second, we systematically analyze how instruction prompts and conversational context shape the emergent persona of an LLM, altering the expression of biases and subjective opinions. We find that LLM personas adopt additional standpoints associated with certain concepts or ideologies, even when not explicitly instructed to do so. This effect can occur even with merely anecdotal context, where models implicitly build on stereotypes to infer their persona's ideology, and it also depends on the language used. Finally, we observe that political LLM personas and their stereotypes exhibit a Western bias, even when prompted in Arabic.
Paper Type: Long
Research Area: Ethics, Bias, and Fairness
Research Area Keywords: model bias/fairness evaluation, ethical considerations in NLP applications
Contribution Types: NLP engineering experiment, Data analysis
Languages Studied: English, Arabic
Submission Number: 6548