Capturing Minds, Not Just Words: Enhancing Role-Playing Language Models with Personality-Indicative Data

ACL ARR 2024 June Submission3857 Authors

16 Jun 2024 (modified: 02 Jul 2024) · ACL ARR 2024 June Submission · CC BY 4.0
Abstract: Role-playing agents (RPAs) have become a popular application of large language models (LLMs), attracting significant interest from both industry and academia. While existing RPAs portray characters' knowledge and tone well, they struggle to capture characters' minds, especially in the case of small role-playing language models (RPLMs). In this paper, we propose to enhance RPLMs with personality-indicative data. Specifically, we leverage questions from psychological scales and distill advanced RPAs to generate dialogues that grasp the minds of characters. Experimental results validate that RPLMs trained on our dataset exhibit stronger role-playing capabilities in both general and personality-related evaluations.
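The data-construction step summarized in the abstract can be pictured as a small pipeline: rephrase items from a psychological scale as open-ended interview questions, ask a strong role-playing model to answer them in character, and keep the resulting question-answer pairs as personality-indicative training dialogues. The sketch below is illustrative only, not the authors' released code; the model name, prompt wording, scale items, and the `build_question` helper are assumptions made for this example.

```python
# Illustrative sketch: generate personality-indicative dialogues by asking a
# strong role-playing LLM to answer psychological-scale items in character.
# Model name, prompts, scale items, and helpers are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # requires OPENAI_API_KEY in the environment

# A few BFI-style items (hypothetical subset used only for this sketch).
SCALE_ITEMS = [
    "I see myself as someone who is talkative.",
    "I see myself as someone who tends to find fault with others.",
    "I see myself as someone who is a reliable worker.",
]

def build_question(item: str) -> str:
    """Turn a scale statement into an open-ended interview question."""
    return (f"Someone says about you: '{item}' "
            "Do you agree? Please explain in your own words.")

def generate_dialogue(character: str, profile: str, item: str) -> dict:
    """Ask an advanced role-playing model to answer one scale item in character."""
    question = build_question(item)
    response = client.chat.completions.create(
        model="gpt-4o",  # assumed backend; any capable RPA could be substituted
        messages=[
            {"role": "system",
             "content": f"You are role-playing {character}. Character profile: {profile}. "
                        "Stay fully in character and reveal the character's inner thoughts."},
            {"role": "user", "content": question},
        ],
    )
    answer = response.choices[0].message.content
    # Each record becomes one personality-indicative training dialogue for the RPLM.
    return {"character": character, "question": question, "answer": answer}

if __name__ == "__main__":
    dataset = [
        generate_dialogue("Hermione Granger",
                          "A brilliant, diligent, rule-respecting young witch.",
                          item)
        for item in SCALE_ITEMS
    ]
    for record in dataset:
        print(record["question"], "->", record["answer"][:80], "...")
```

Under these assumptions, the collected records would then be converted into supervised fine-tuning examples for the smaller RPLM, which is the distillation step the abstract refers to.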
Paper Type: Short
Research Area: Dialogue and Interactive Systems
Research Area Keywords: spoken dialogue systems; task-oriented; interactive storytelling; embodied agents
Contribution Types: NLP engineering experiment, Publicly available software and/or pre-trained models, Data resources
Languages Studied: English, Chinese
Submission Number: 3857