Physically Embodied Gaussian Splatting: A Visually Learnt and Physically Grounded 3D Representation for Robotics
Keywords: 3D Representation, Gaussian Splatting, Robotics, Tracking, Physics
TL;DR: We use particles to represent the physical state of a robot's world and 3D Gaussians for its visual state, combining the two to enable forward modeling and real-time visual correction from images captured by three cameras.
Abstract: For robots to robustly understand and interact with the physical world, it is highly beneficial to have a comprehensive representation -- modeling geometry, physics, and visual observations -- that informs perception, planning, and control algorithms. We propose a novel dual "Gaussian-Particle" representation that models the physical world while (i) enabling predictive simulation of future states and (ii) allowing online correction from visual observations in a dynamic world. Our representation comprises particles that capture the geometry of objects in the world and can be used alongside a particle-based physics system to anticipate physically plausible future states. Attached to these particles are 3D Gaussians that render images from any viewpoint through a splatting process, thus capturing the visual state. By comparing the predicted and observed images, our approach generates "visual forces" that correct the particle positions while respecting known physical constraints. By integrating predictive physical modeling with continuous visually-derived corrections, our unified representation reasons about the present and future while synchronizing with reality. We validate our approach on 2D and 3D tracking tasks as well as photometric reconstruction quality. Videos can be found at https://embodied-gaussians.github.io/
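The correction loop described in the abstract -- compare a predicted (splatted) image against an observed camera image, convert the photometric residual into per-particle "visual forces", and nudge particle positions -- can be sketched as follows. This is a minimal illustrative toy, not the paper's actual method: the rendering step, the particle-to-pixel mapping, and all names (`visual_force_step`, `lr`) are assumptions, and a real implementation would differentiate through the splatting renderer and project updates onto the physics constraints.

```python
import numpy as np

def visual_force_step(particles, predicted, observed, lr=0.1):
    """One hypothetical correction step (illustrative only).

    particles : (N, 3) array of particle positions.
    predicted : rendered image from the current Gaussian/particle state.
    observed  : real camera image at the same viewpoint.

    The photometric residual is pooled into one scalar "visual force"
    per particle, which displaces that particle. The real system would
    instead backpropagate the image error through the splatting process
    and respect physical constraints.
    """
    residual = observed - predicted                      # per-pixel error
    # Toy assumption: pixels are evenly attributed to particles.
    force = lr * residual.reshape(len(particles), -1).mean(axis=1)
    return particles + force[:, None]                    # shift along error

# Toy usage: a uniform brightness mismatch pulls all particles equally.
parts = np.zeros((4, 3))
pred = np.zeros((4, 4))
obs = np.ones((4, 4))
corrected = visual_force_step(parts, pred, obs)
```

When predicted and observed images match, the residual vanishes and the particles stay put, so the representation stays synchronized with reality only through genuine visual disagreement.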
Supplementary Material: zip
Website: https://embodied-gaussians.github.io/
Publication Agreement: pdf
Student Paper: yes
Submission Number: 199