Spacetime Gaussian Feature Splatting for Real-Time Dynamic View Synthesis
Abstract: Novel view synthesis of dynamic scenes has been an intriguing yet challenging problem. Despite recent advancements, simultaneously achieving high-resolution photorealistic results, real-time rendering, and compact storage remains a formidable task. To address these challenges, we propose Spacetime Gaussian Feature Splatting as a novel dynamic scene representation, composed of three pivotal components. First, we formulate expressive Spacetime Gaussians by enhancing 3D Gaussians with temporal opacity and parametric motion/rotation. This enables Spacetime Gaussians to capture static, dynamic, as well as transient content within a scene. Second, we introduce splatted feature rendering, which replaces spherical harmonics with neural features. These features facilitate the modeling of view- and time-dependent appearance while maintaining small size. Third, we leverage the guidance of training error and coarse depth to sample new Gaussians in areas that are challenging to converge with existing pipelines. Experiments on several established real-world datasets demonstrate that our method achieves state-of-the-art rendering quality and speed, while retaining compact storage. At 8K resolution, our lite-version model can render at 60 FPS on an Nvidia RTX 4090 GPU. Our code is available at https://github.com/oppo-us-research/SpacetimeGaussians.
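As a rough illustration of the temporal-opacity idea described above, a Spacetime Gaussian's spatial opacity can be modulated by a 1D Gaussian in time, so each primitive fades in and out around a temporal center. This is a minimal sketch under assumed names and parameterization (`sigma_s`, `mu_t`, `s_t` are illustrative), not the paper's actual implementation:

```python
import numpy as np

def temporal_opacity(t, sigma_s, mu_t, s_t):
    """Effective opacity of one Spacetime Gaussian at time t (illustrative).

    sigma_s : the Gaussian's time-independent spatial opacity
    mu_t    : temporal center, where the primitive is most visible
    s_t     : temporal decay; larger values make the primitive more transient
    """
    return sigma_s * np.exp(-s_t * (t - mu_t) ** 2)

# At t == mu_t the modulation factor is 1, so the full spatial opacity applies;
# away from mu_t the primitive's contribution decays smoothly toward zero.
peak = temporal_opacity(0.5, 1.0, 0.5, 16.0)   # 1.0
off  = temporal_opacity(0.0, 1.0, 0.5, 16.0)   # < 1.0
print(peak, off)
```

A near-static object can be represented with a very small `s_t` (opacity nearly constant over the sequence), while transient content gets a large `s_t`, which matches the abstract's claim that one formulation covers static, dynamic, and transient regions.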