Abstract: We propose a novel unsupervised method for motion-based 3D part decomposition of articulated objects
using a single monocular video of a dynamic scene. In contrast to existing unsupervised methods that rely on optical flow or
tracking, our approach requires no such additional information, instead leveraging Gaussian splatting
techniques. We generate a series of Gaussians from a monocular video and analyze their relationships to decompose the dynamic
scene into motion-based parts. To decompose dynamic scenes containing articulated objects, we design an articulated
deformation field tailored to their motion, and to effectively capture the relationships among Gaussians of different shapes, we propose a 3D reconstruction loss based on 3D occupied voxel maps generated from the Gaussians.
Experimental results demonstrate that our method outperforms existing approaches in terms of 3D part decomposition for
articulated objects. More demos and code are available at https://choonsik93.github.io/artnerf/.
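The occupied-voxel-map idea from the abstract can be illustrated with a minimal sketch: rasterize Gaussian centers into a binary 3D grid and compare two grids with an IoU-style loss. The function names, grid resolution, and loss form here are illustrative assumptions, not the paper's actual formulation (which may also account for each Gaussian's covariance/extent).

```python
import numpy as np

def occupancy_from_gaussians(centers, grid_res=32, bounds=(-1.0, 1.0)):
    """Hypothetical sketch: mark voxels containing a Gaussian center.

    centers: (N, 3) array of Gaussian means inside `bounds`.
    """
    lo, hi = bounds
    # Map each center into integer voxel coordinates.
    idx = ((centers - lo) / (hi - lo) * grid_res).astype(int)
    idx = np.clip(idx, 0, grid_res - 1)
    occ = np.zeros((grid_res, grid_res, grid_res), dtype=bool)
    occ[idx[:, 0], idx[:, 1], idx[:, 2]] = True
    return occ

def voxel_recon_loss(occ_a, occ_b):
    """Illustrative IoU-style reconstruction loss between occupancy maps."""
    inter = np.logical_and(occ_a, occ_b).sum()
    union = np.logical_or(occ_a, occ_b).sum()
    return 1.0 - inter / max(union, 1)
```

For example, two Gaussians at distinct positions occupy two voxels, and comparing a map with itself yields zero loss.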