Abstract: This paper introduces a novel concept for Human Activity Recognition (HAR) that enables robust analysis, classification, and understanding of human movements in diverse environments. It can serve a range of applications, such as health monitoring and analysis, fitness and dance training with performance analysis, interactive gaming, smart homes, and wearable devices. The proposed method, termed STL-HAR (Spatio-Temporal Learning for HAR), learns from sensor data jointly represented in space and time to make the HAR process more robust. In this concept, we propose a hybrid model that combines a Graph Neural Network (GNN) with a Long Short-Term Memory (LSTM) network. The GNN first learns spatial features from data at different sensor locations. The learned features are then fed into the LSTM, which captures temporal information by observing sensor states at successive timestamps. We evaluate and analyze the performance of STL-HAR on real-world HAR use cases and compare it with baseline HAR solutions. STL-HAR achieves a recognition rate of 92% across the different scenarios.
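To make the spatial-then-temporal pipeline concrete, the following is a minimal sketch (not the authors' implementation) of a GNN + LSTM hybrid for HAR. It assumes input of shape (batch, time, num_sensors, channels) and a fixed sensor adjacency matrix; the layer sizes, class names, and the simple normalized-adjacency graph convolution are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn

class STLHARSketch(nn.Module):
    """Hypothetical GNN -> LSTM hybrid: spatial aggregation per timestamp,
    then temporal modeling over the sequence of aggregated sensor features."""
    def __init__(self, num_sensors, in_channels, gnn_dim, lstm_dim, num_classes, adj):
        super().__init__()
        # Symmetrically normalized adjacency with self-loops (simple GCN-style propagation).
        a = adj + torch.eye(num_sensors)
        d_inv_sqrt = a.sum(dim=1).pow(-0.5)
        self.register_buffer("a_norm", d_inv_sqrt[:, None] * a * d_inv_sqrt[None, :])
        self.gnn_lin = nn.Linear(in_channels, gnn_dim)       # spatial feature extractor
        self.lstm = nn.LSTM(num_sensors * gnn_dim, lstm_dim, batch_first=True)
        self.classifier = nn.Linear(lstm_dim, num_classes)   # activity logits

    def forward(self, x):
        # x: (batch, time, num_sensors, in_channels)
        b, t, n, c = x.shape
        # Spatial step: one graph-convolution pass per timestamp.
        h = torch.relu(self.a_norm @ self.gnn_lin(x))        # (b, t, n, gnn_dim)
        # Temporal step: flatten per-sensor features and feed the sequence to the LSTM.
        h = h.reshape(b, t, -1)
        out, _ = self.lstm(h)
        return self.classifier(out[:, -1])                   # use the last hidden state

# Example usage with random data: 2 samples, 50 timestamps, 5 sensors, 3 channels each.
adj = torch.ones(5, 5)
model = STLHARSketch(num_sensors=5, in_channels=3, gnn_dim=16,
                     lstm_dim=32, num_classes=6, adj=adj)
logits = model(torch.randn(2, 50, 5, 3))
print(logits.shape)  # torch.Size([2, 6])
```

This sketch only illustrates the ordering described in the abstract (spatial learning across sensor locations followed by temporal learning across timestamps); the actual graph construction, GNN variant, and training setup of STL-HAR are specified in the paper itself.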