Keywords: Dynamic Reconstruction, Gaussian Splatting
TL;DR: A framework that uses graphs as an explicit and sparse motion representation for Gaussian splatting to reconstruct dynamic scenes, with applications in animation, robot data synthesis, and planning.
Abstract: Gaussian splatting has emerged as a powerful tool for high-fidelity reconstruction of dynamic scenes. However, existing methods primarily rely on implicit motion representations, such as encoding motions into neural networks or per-Gaussian parameters, which makes it difficult to further manipulate the reconstructed motions. This lack of explicit controllability limits existing methods to replaying recorded motions only, hindering wider application. To address this, we propose Motion Blender Gaussian Splatting (MB-GS), a novel framework that uses a motion graph as an explicit and sparse motion representation. The motion of graph links is propagated to individual Gaussians via dual quaternion skinning, with learnable weight painting functions determining the influence of each link. The motion graphs and 3D Gaussians are jointly optimized from input videos via differentiable rendering. Experiments show that MB-GS achieves state-of-the-art performance on the iPhone dataset while being competitive on HyperNeRF. Additionally, we demonstrate the application potential of our method in animating novel object motions, synthesizing robot demonstrations through motion editing, and predicting robot actions through visual planning.
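To illustrate the dual quaternion skinning step the abstract describes, here is a minimal numpy sketch of blending per-link rigid transforms into the motion of a single Gaussian center. The link transforms, the weights, and all function names are illustrative assumptions, not the authors' implementation (which learns the weights via weight painting functions and differentiable rendering).

```python
# Minimal sketch of dual quaternion skinning (DQS): per-link rigid motions
# are blended with skinning weights and applied to a Gaussian center.
import numpy as np

def quat_mul(q, r):
    """Hamilton product of quaternions in (w, x, y, z) order."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def dual_quat(rot_q, t):
    """Dual quaternion (real, dual) from a unit rotation quaternion and a translation."""
    real = rot_q / np.linalg.norm(rot_q)
    dual = 0.5 * quat_mul(np.array([0.0, *t]), real)
    return real, dual

def blend_and_apply(dqs, weights, point):
    """Blend per-link dual quaternions with skinning weights, then move a point."""
    real = np.zeros(4)
    dual = np.zeros(4)
    pivot = dqs[0][0]
    for (r, d), w in zip(dqs, weights):
        s = np.sign(np.dot(pivot, r)) or 1.0  # keep quaternions in the same hemisphere
        real += w * s * r
        dual += w * s * d
    n = np.linalg.norm(real)
    real, dual = real / n, dual / n
    # Rotate the point with the real part, then add the translation
    # encoded in the dual part (t = 2 * dual * conj(real)).
    p = np.array([0.0, *point])
    r_conj = real * np.array([1.0, -1.0, -1.0, -1.0])
    rotated = quat_mul(quat_mul(real, p), r_conj)[1:]
    translation = 2.0 * quat_mul(dual, r_conj)[1:]
    return rotated + translation

# Example: two links, one static and one rotated 90 deg about z and shifted;
# the hard-coded weights [0.3, 0.7] stand in for a learned weight painting.
identity = dual_quat(np.array([1.0, 0.0, 0.0, 0.0]), np.zeros(3))
rot90z = dual_quat(np.array([np.cos(np.pi/4), 0.0, 0.0, np.sin(np.pi/4)]),
                   np.array([1.0, 0.0, 0.0]))
gaussian_center = np.array([1.0, 0.0, 0.0])
print(blend_and_apply([identity, rot90z], [0.3, 0.7], gaussian_center))
```

Blending in dual quaternion space (rather than averaging matrices) keeps the interpolated motion a valid rigid transform after normalization, which is why DQS is the standard choice for propagating sparse link motions to many points.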
Supplementary Material: zip
Spotlight: zip
Submission Number: 32