Keywords: self-supervised learning, biomechanics, physics-informed neural networks
TL;DR: A method for estimating human movement dynamics from unlabeled IMU data using a physical multibody dynamics model.
Abstract: Accurate real-time monitoring of not only movements, but also the internal joint moments or muscle forces that cause movement in unrestricted environments, is key for many clinical and sports applications. A minimally obtrusive way to monitor movements is with wearable sensors, such as inertial measurement units, using the fewest sensors possible.
Current real-time methods rely on supervised learning, where a ground truth dataset must be recorded with laboratory measurement systems, such as optical motion capture, and then processed with methods that are known to introduce errors. Moreover, there is a discrepancy between laboratory and real-world movements, and analysing new motions would require recording new ground truth data, which is impractical.
Therefore, we introduce SSPINNpose, a self-supervised physics-informed neural network that estimates movement dynamics, including joint angles and joint moments, from inertial sensors without the need for ground truth data for training.
We run the network output through a physics model of the human body to optimize physical plausibility and generate virtual measurement data. Using this virtual sensor data, the network is trained directly on the measured sensor data instead of a ground truth. Experiments show that SSPINNpose accurately estimates joint angles and joint moments, with errors of 8.7 degrees and 4.9 BWBH%, respectively, for walking and running at speeds of up to 4.9 m/s, at a latency of 3.5 ms. We further show the versatility of our method by estimating movement dynamics for a variety of sparse sensor configurations and by inferring the positions where the sensors are placed on the body.
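The core self-supervised idea described above can be sketched in a few lines: a network predicts kinematics, a physics-based measurement model maps those predictions to virtual IMU readings, and the training loss compares the virtual readings to the real measurements, so no ground-truth labels are needed. The sketch below is a toy illustration, not the paper's implementation; `virtual_imu` stands in for the actual multibody dynamics and sensor model, and all names are hypothetical.

```python
import numpy as np

def virtual_imu(joint_angles):
    # Hypothetical stand-in for the physics-based measurement model:
    # map predicted joint angles to virtual accelerometer/gyroscope
    # channels. In SSPINNpose this is a multibody dynamics model.
    return np.concatenate([np.sin(joint_angles), np.cos(joint_angles)], axis=-1)

def self_supervised_loss(predicted_angles, measured_imu):
    # Compare virtual sensor data against the measured sensor data
    # itself -- no ground-truth joint angles are required.
    residual = virtual_imu(predicted_angles) - measured_imu
    return float(np.mean(residual ** 2))

# Usage: random "network output" over T timesteps and 8 joints, and a
# matching measured signal (here generated from the same angles, so the
# loss is exactly zero for a perfect prediction).
rng = np.random.default_rng(0)
angles = rng.normal(size=(100, 8))
measured = virtual_imu(angles)
print(self_supervised_loss(angles, measured))  # -> 0.0
```

In the actual method, this data term would be combined with physics-based plausibility terms (the "physics-informed" part), so the network is regularized toward dynamically consistent motion rather than fitting the sensor signal alone.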
Supplementary Material: zip
Primary Area: neurosymbolic & hybrid AI systems (physics-informed, logic & formal reasoning, etc.)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 511