Learning Articulated Rigid Body Dynamics Simulations From Video

Published: 25 Mar 2022, Last Modified: 05 May 2023
ICLR 2022 OSC Poster
Keywords: learning simulators, differentiable simulation, inverse rendering, real2sim
TL;DR: We learn articulated rigid body simulators from RGB or depth video.
Abstract: Because they can reproduce physical phenomena ranging from light interaction to contact mechanics, simulators are becoming useful in a growing number of application domains where real-world interaction or labeled data is difficult to obtain. Despite this growing attention, configuring a simulator to accurately reproduce real-world behavior still requires significant human effort. We introduce a pipeline that combines inverse rendering with differentiable simulation to create digital twins of real-world articulated mechanisms from depth or RGB videos. Our approach automatically discovers joint types and estimates their kinematic parameters, while the dynamic properties of the overall mechanism are tuned to attain physically accurate simulations. On a real-world coupled pendulum system observed through RGB video, we correctly determine its articulation and simulation parameters, such that its motion can be reproduced accurately in a physics engine. Having learned a simulator from depth video, we demonstrate on a simulated cartpole that a model-predictive controller can leverage the learned dynamics model to control nonlinear systems.
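The core idea of tuning a simulator's dynamic properties against an observed trajectory can be illustrated with a minimal sketch. The example below is not the paper's method: it fits a single hypothetical parameter (the length of an ideal pendulum) to a synthetic "observed" trajectory by gradient descent, and uses finite differences where a differentiable simulator would supply analytic gradients. All names and constants are illustrative assumptions.

```python
import numpy as np

def simulate(length, theta0=1.0, steps=200, dt=0.01, g=9.81):
    """Semi-implicit Euler rollout of an ideal pendulum with a given length.

    Returns the angle trajectory; stands in for a full rigid-body simulator.
    """
    theta, omega = theta0, 0.0
    traj = []
    for _ in range(steps):
        omega += dt * (-g / length) * np.sin(theta)
        theta += dt * omega
        traj.append(theta)
    return np.array(traj)

# Synthetic "observation" from a ground-truth length of 0.7 m
# (in the paper this would come from tracking the video).
target = simulate(length=0.7)

def loss(length):
    # Mean squared error between simulated and observed angles.
    return np.mean((simulate(length) - target) ** 2)

# Gradient descent on the parameter; finite differences approximate
# the gradients that a differentiable simulator provides analytically.
length, lr, eps = 1.0, 0.02, 1e-5
for _ in range(1500):
    grad = (loss(length + eps) - loss(length)) / eps
    length -= lr * grad

print(round(length, 3))  # should move toward the true length of 0.7
```

In the paper this single-parameter fit is replaced by jointly estimating the kinematic structure (joint types and parameters, via inverse rendering) and the dynamic parameters of the whole mechanism.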