Dynamics-regulated kinematic policy for egocentric pose estimation

Published: 09 Nov 2021, Last Modified: 25 Nov 2024 · NeurIPS 2021 Poster · Readers: Everyone
Keywords: Egocentric human pose estimation, Physics Simulation, augmented reality, humanoid control, kinematics and dynamics
Abstract: We propose a method for object-aware 3D egocentric pose estimation that tightly integrates kinematics modeling, dynamics modeling, and scene object information. Unlike prior kinematics- or dynamics-based approaches where the two components are used disjointly, we synergize the two approaches via dynamics-regulated training. At each timestep, a kinematic model provides a target pose using video evidence and the simulation state. Then, a prelearned dynamics model attempts to mimic the kinematic pose in a physics simulator. By comparing the pose instructed by the kinematic model against the pose generated by the dynamics model, we can use their misalignment to further improve the kinematic model. By factoring in the 6DoF pose of objects (e.g., chairs, boxes) in the scene, we demonstrate, for the first time, the ability to estimate physically-plausible 3D human-object interactions using a single wearable camera. We evaluate our egocentric pose estimation method in both controlled laboratory settings and real-world scenarios.
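The abstract describes a per-timestep loop in which the kinematic model proposes a pose, the prelearned dynamics model tries to realize it in simulation, and the resulting misalignment is fed back to train the kinematic model. Below is a minimal conceptual sketch of that loop, assuming PyTorch; all class and function names (`KinematicPolicy`, `DynamicsModel`, `PhysicsSim`, `dynamics_regulated_step`), dimensions, and the toy simulator are hypothetical placeholders rather than the authors' API; see the linked repository for the actual implementation.

```python
# Conceptual sketch of dynamics-regulated training (not the authors' code).
import torch
import torch.nn as nn

class KinematicPolicy(nn.Module):
    """Predicts a target pose from egocentric video features and the simulation state."""
    def __init__(self, feat_dim=256, state_dim=76, pose_dim=76):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim + state_dim, 512), nn.ReLU(),
            nn.Linear(512, pose_dim),
        )
    def forward(self, video_feat, sim_state):
        return self.net(torch.cat([video_feat, sim_state], dim=-1))

class DynamicsModel(nn.Module):
    """Stand-in for the prelearned dynamics controller that mimics a target pose."""
    def __init__(self, state_dim=76, pose_dim=76, action_dim=69):
        super().__init__()
        self.net = nn.Linear(state_dim + pose_dim, action_dim)
    def forward(self, sim_state, target_pose):
        return self.net(torch.cat([sim_state, target_pose], dim=-1))

class PhysicsSim:
    """Toy stand-in for a physics simulator holding the humanoid state."""
    def __init__(self, state_dim=76):
        self.state = torch.zeros(1, state_dim)
    def get_state(self):
        return self.state
    def step(self, action):
        # A real simulator would integrate rigid-body dynamics under the action;
        # here we only return a pose-sized tensor so the loop runs end to end.
        self.state = self.state + 0.01 * torch.randn_like(self.state)
        return self.state

def dynamics_regulated_step(kin_policy, dyn_model, sim, video_feat, optimizer):
    """One timestep: kinematics proposes a pose, dynamics tries to realize it in
    simulation, and the misalignment updates the kinematic policy."""
    sim_state = sim.get_state().detach()
    target_pose = kin_policy(video_feat, sim_state)       # kinematic target pose
    with torch.no_grad():
        action = dyn_model(sim_state, target_pose)        # prelearned controller output
    achieved_pose = sim.step(action).detach()             # physically simulated pose
    loss = ((achieved_pose - target_pose) ** 2).mean()    # misalignment regulates kinematics
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    kin, dyn, sim = KinematicPolicy(), DynamicsModel(), PhysicsSim()
    opt = torch.optim.Adam(kin.parameters(), lr=1e-4)
    video_feat = torch.randn(1, 256)  # placeholder for per-frame video features
    print(dynamics_regulated_step(kin, dyn, sim, video_feat, opt))
```

In this sketch, gradients flow only through the kinematic policy's target pose, so the loss pulls the kinematic predictions toward poses the dynamics model can actually achieve in simulation, which is one plausible reading of the dynamics-regulated training described above.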
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
TL;DR: We propose a method for estimating physically plausible 3D human poses and human-object interactions from an egocentric camera.
Supplementary Material: zip
Code: https://github.com/KlabCMU/kin-poly
Community Implementations: [2 code implementations (CatalyzeX)](https://www.catalyzex.com/paper/dynamics-regulated-kinematic-policy-for/code)