Abstract: Effective human–robot collaboration (HRC) requires robots to understand and adapt to humans' psychological states. This research presents a novel approach to quantitatively measuring human comfort during HRC through the development of two metrics: a comfortability index (CI) and an uncomfortability index (UnCI). We conducted HRC experiments in which participants performed assembly tasks while the robot's behavior was systematically varied. Participants' subjective responses (including surprise, anxiety, boredom, calmness, and comfortability ratings) were collected alongside physiological signals, including electrocardiogram, galvanic skin response, and pupillometry data. We propose two approaches for estimating CI/UnCI: an adaptation of the emotion circumplex model that maps comfort levels to the arousal–valence space, and a kernel density estimation model trained on physiological data. Time-domain features were extracted from the physiological signals and used to train machine learning models for real-time estimation of comfort levels. Our results demonstrate that the proposed approaches can effectively estimate human comfort levels from physiological signals alone, with the circumplex model showing particular promise in detecting high-discomfort states. This work enables real-time measurement of human comfort during HRC, providing a foundation for more adaptive and human-aware collaborative robots.
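To make the kernel density estimation idea concrete, the following is a minimal sketch, not the paper's actual pipeline: it assumes hypothetical time-domain features (mean heart rate from ECG, mean GSR amplitude, mean pupil diameter), synthetic training data for labelled comfortable and uncomfortable conditions, and a simple normalised-density definition of CI/UnCI.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Hypothetical time-domain features per observation window:
# [mean heart rate (bpm), mean GSR amplitude (uS), mean pupil diameter (mm)].
# Synthetic data stands in for labelled training windows.
rng = np.random.default_rng(0)
comfortable_features = rng.normal(loc=[70.0, 2.0, 3.5], scale=0.5, size=(200, 3))
uncomfortable_features = rng.normal(loc=[85.0, 4.0, 4.5], scale=0.5, size=(200, 3))

# Fit one kernel density estimate per labelled condition
# (gaussian_kde expects an array of shape (n_features, n_samples)).
kde_comfort = gaussian_kde(comfortable_features.T)
kde_discomfort = gaussian_kde(uncomfortable_features.T)

def comfort_indices(window_features: np.ndarray) -> tuple[float, float]:
    """Return (CI, UnCI) for one feature window as normalised densities."""
    p_c = kde_comfort(window_features.reshape(-1, 1))[0]
    p_u = kde_discomfort(window_features.reshape(-1, 1))[0]
    total = p_c + p_u
    if total == 0.0:
        return 0.5, 0.5  # uninformative window: no evidence either way
    return p_c / total, p_u / total

# Example: score a new observation window as it arrives.
ci, unci = comfort_indices(np.array([82.0, 3.8, 4.3]))
print(f"CI={ci:.2f}, UnCI={unci:.2f}")
```

Because each window is scored independently against the fitted densities, the same scheme could run over a sliding window of incoming physiological samples to approximate the real-time estimation described above.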