Abstract: Gait is a fundamental aspect of human mobility, and disruptions in normal gait can significantly reduce quality of life (QOL). Although recent advances in 3D gait analysis (3DGA) enable precise, quantitative assessments, these methods are typically confined to controlled laboratory environments and thus fail to capture natural gait variability. Conversely, wearable IMU sensors offer cost-effective, portable solutions for capturing movement across diverse settings but face challenges such as obtrusiveness and sensor drift. In this study, we propose "Gait Inertial Poser (GIP)," a novel method that estimates 3D full-body pose during straight walking on flat ground using only two shoe-embedded IMU sensors. GIP first estimates a personalized body shape from user attributes (height, weight, age, gender) and then employs a Transformer-based module to infer gait motion parameters from the IMU data. To ensure temporal continuity and smoothness of the estimated motion, we further introduce a smoothing module based on a Variational Autoencoder (VAE), which incorporates a specialized loss function that explicitly enforces kinematic constraints during foot-ground contact, thereby improving overall estimation accuracy. Comprehensive experiments on two public datasets quantitatively and qualitatively demonstrate that GIP achieves high accuracy for straight-line walking. This approach overcomes the limitations of traditional laboratory-based methods, opening new opportunities for real-time monitoring and remote rehabilitation in everyday environments. The code will be available at https://github.com/RyosukeHori/GaitInertialPoser
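The two-stage pipeline described above (attribute-based shape estimation, Transformer-based motion inference, then VAE-based smoothing) can be sketched at a high level as follows. This is a minimal illustrative sketch only: every function name, parameter, and computation here is a placeholder assumption standing in for the learned modules, not the authors' actual implementation or API.

```python
# Hypothetical sketch of the GIP pipeline from the abstract. All names and
# computations are illustrative placeholders for the learned components.

def estimate_body_shape(height_cm, weight_kg, age, gender):
    # Placeholder: map user attributes to a body-shape parameter vector
    # (e.g. SMPL-style betas); here a toy normalized encoding.
    g = 1.0 if gender == "male" else 0.0
    return [height_cm / 200.0, weight_kg / 100.0, age / 100.0, g]

def transformer_pose_module(imu_frames, shape):
    # Placeholder for the Transformer-based module: per-frame IMU readings
    # from the two shoe sensors -> raw per-frame gait motion parameters.
    return [[sum(frame) * 0.01 + s for s in shape] for frame in imu_frames]

def vae_smoothing(poses):
    # Placeholder for the VAE smoothing module: a simple moving average
    # stands in for temporally consistent latent-space smoothing.
    smoothed = []
    for i in range(len(poses)):
        window = poses[max(0, i - 1): i + 2]
        smoothed.append([sum(vals) / len(window) for vals in zip(*window)])
    return smoothed

def gait_inertial_poser(imu_frames, height_cm, weight_kg, age, gender):
    # Full pipeline: shape estimation -> motion inference -> smoothing.
    shape = estimate_body_shape(height_cm, weight_kg, age, gender)
    raw_poses = transformer_pose_module(imu_frames, shape)
    return vae_smoothing(raw_poses)
```

In the actual method the second and third stages are trained neural networks, and the smoothing stage additionally applies a loss enforcing kinematic constraints during foot-ground contact; the sketch only conveys the data flow.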