This study addresses the challenge of efficient human activity recognition (HAR) with limited training data. We propose GEAR-FEN (Generalized Activity Recognition Feature Extraction Network), a novel transfer learning method that transforms kinematic motion signals into a generalized feature space. GEAR-FEN can outperform the state of the art in scenarios with limited training data, as demonstrated through an evaluation across 11 public HAR datasets (spanning 6 to 33 activities and 8,628 to 1,140,258 samples per activity), using a deep learning model that combines convolutional neural networks (CNN), residual bi-directional long short-term memory (ResBiLSTM), and an attention mechanism. Furthermore, we establish the generalizability of our method through performance comparisons on an independent dataset covering a distinct population and diverse kinematic modalities, with 8 activities and 26,121 samples per activity. These findings highlight the potential of our proposed approach for robust feature representation in HAR tasks with limited training data.
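To illustrate the attention-mechanism component mentioned above, the sketch below shows a common additive attention-pooling pattern over a sequence of recurrent features, in NumPy. This is not the authors' implementation: the feature dimensions, the scoring function, and all variable names (`H`, `w`, `attention_pool`) are illustrative assumptions, standing in for the outputs a ResBiLSTM stage might produce.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(H, w, b=0.0):
    """Score each time step of a feature sequence H (T, D) with a
    learned vector w (D,), then return the attention-weighted sum.

    In a full HAR model, H would be the per-time-step outputs of a
    (Res)BiLSTM layer; here it is random data for illustration.
    """
    scores = np.tanh(H @ w + b)        # (T,) unnormalized step scores
    alpha = softmax(scores)            # (T,) attention weights, sum to 1
    context = alpha @ H                # (D,) pooled feature vector
    return context, alpha

# Hypothetical shapes: 128 time steps, 64-dim recurrent features.
rng = np.random.default_rng(0)
H = rng.standard_normal((128, 64))
w = rng.standard_normal(64)
context, alpha = attention_pool(H, w)
```

The pooled `context` vector collapses the time axis into a single fixed-length feature, which is what allows a downstream classifier head to operate on variable-length activity windows.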