CARE: Learning Adaptive Counseling Strategies through Cognitive Architecture Simulation and Reflective Evolution
Keywords: Cognitive Architecture, Counseling Dialogue Systems, Client Simulation, Strategy Evolution
Abstract: Large language models (LLMs) offer potential as scalable solutions for mental health support. However, existing LLM counselors struggle to dynamically adapt their strategies to individual client needs. In this work, we aim to design an automatic evolution system for LLM counselors, leveraging the capabilities of frontier models. A natural solution is to simulate diverse clients and allow the counselor to improve through reflection on these simulated interactions. A major challenge, however, is that LLM-simulated clients are systematically biased: they can be sycophantic, tend to accept suggestions readily, and rarely exhibit the resistance common in real counseling. This bias prevents the counselor from learning to handle potential difficulties. We introduce \textbf{CARE} (\textbf{C}ognitive \textbf{A}rchitecture for \textbf{R}eflective counselor \textbf{E}volution). CARE employs a cognitive architecture to explicitly model client internal states that evolve as the conversation progresses. Client responses are generated conditioned on these internal states, forcing resistance when the states indicate distrust. This architecture also enables automatic counselor evolution through reflection: since internal states can be exposed after simulation, we can identify precisely which counselor utterances triggered negative state shifts and use this signal to automatically refine counseling strategies without human annotation. Experimental results demonstrate that CARE achieves superior performance in simulation experiments, with the highest problem resolution rate (27.1\%) and emotional improvement (39.2\%). Our intervention trials further confirm the practical potential of CARE.
Paper Type: Long
Research Area: Dialogue and Interactive Systems
Research Area Keywords: conversational modeling
Contribution Types: NLP engineering experiment
Languages Studied: English, Chinese
Submission Number: 9178