Abstract: Emotion recognition from gait has gained significant interest due to its applicability in fields such as healthcare, social interaction analysis, surveillance, and smart applications. Gait, as a biometric trait, offers unique advantages, allowing remote identification and robust recognition even in uncontrolled scenarios. Moreover, gait analysis can provide valuable insights into an individual’s emotional state. This work presents the “Walk-as-you-Feel” (WayF) framework, a novel approach for gait-based emotion recognition that does not rely on facial cues, ensuring user privacy. To address the challenges posed by small and unbalanced datasets, a balancing procedure suitable for deep learning architectures is also developed. Adapted Inception-v3 and EfficientNet models are employed for the feature extraction phase. Classification is performed using a Gated Recurrent Unit (GRU) network and a Transformer encoder. Experimental results demonstrate that the proposed approach is competitive with state-of-the-art works that also integrate facial cues. WayF reaches an average recognition rate of approximately 77% in its best configuration. Moreover, when the neutral emotion is excluded, the proposed method achieves an overall accuracy of 83.3%.
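The abstract outlines a two-stage pipeline: a CNN backbone (adapted Inception-v3 or EfficientNet) extracts per-frame features from the gait sequence, and a recurrent or attention-based model (GRU or Transformer encoder) classifies the resulting feature sequence into an emotion. As a rough illustration only, the PyTorch sketch below assumes an EfficientNet-B0 backbone followed by a GRU; the class name, layer sizes, sequence length, and number of emotion classes are assumptions and do not reflect the authors' actual WayF implementation.

```python
# Illustrative sketch only: the exact WayF architecture and hyper-parameters are not
# given in the abstract, so all names and sizes here (GaitEmotionGRU, hidden_dim,
# num_emotions) are assumptions.
import torch
import torch.nn as nn
from torchvision import models

class GaitEmotionGRU(nn.Module):
    """Per-frame CNN features (EfficientNet backbone) -> GRU -> emotion logits."""
    def __init__(self, num_emotions: int = 4, hidden_dim: int = 256):
        super().__init__()
        backbone = models.efficientnet_b0(weights=None)        # pretrained weights optional
        self.feature_dim = backbone.classifier[1].in_features  # 1280 for EfficientNet-B0
        backbone.classifier = nn.Identity()                    # keep only the feature extractor
        self.backbone = backbone
        self.gru = nn.GRU(self.feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, num_emotions)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 3, H, W) gait frames
        b, t, c, h, w = frames.shape
        feats = self.backbone(frames.view(b * t, c, h, w)).view(b, t, -1)
        _, last_hidden = self.gru(feats)           # last_hidden: (1, batch, hidden_dim)
        return self.head(last_hidden.squeeze(0))   # emotion logits per clip

# Toy usage: 2 clips of 16 frames at 224x224 resolution
logits = GaitEmotionGRU()(torch.randn(2, 16, 3, 224, 224))
print(logits.shape)  # torch.Size([2, 4])
```

Swapping the GRU for a Transformer encoder (e.g. `nn.TransformerEncoder` over the frame features) would mirror the alternative classifier mentioned in the abstract.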
External IDs: dblp:journals/eaai/BisogniCNPP24