Velocity-Centric 4D Gaussian Splatting for Physically Realistic Dynamic Rendering

ICLR 2026 Conference Submission 18167 Authors

19 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Novel View Synthesis, Dynamic Scene, Gaussian Splatting
Abstract: Synthesizing novel views of dynamic scenes has long been a challenge in computer vision. While existing rendering methods have made progress with static scenes, they struggle to maintain temporal and spatial consistency, as well as physical plausibility, in dynamic scenes, often resulting in jerky motion and unrealistic physical effects. To address this, we propose Phys4DGS, a physically grounded framework that achieves high-fidelity and temporally coherent dynamic scene rendering. Phys4DGS introduces a velocity-aware physical consistency regularization that supervises motion across three complementary representations: intrinsic Gaussian motion attributes, geometric motion, and photometric motion. Furthermore, we introduce unit-time physical interval regularization, which stabilizes motion over time, ensuring continuous dynamics and temporal smoothness. Extensive experiments demonstrate that Phys4DGS outperforms leading methods on dynamic scene rendering, improving PSNR by 7.58 dB, reducing LPIPS by 80.00%, cutting training time by 72.22%, and increasing FPS by 175.48%, which ensures physically realistic, temporally consistent motion.
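The abstract only names the two regularizers without giving their form. The sketch below is a minimal, hypothetical PyTorch illustration of how a velocity-aware consistency term over three velocity estimates (intrinsic, geometric, and photometric) and a unit-time interval smoothness term could be written; the function names, tensor shapes, loss forms, and weights are assumptions for illustration, not the authors' implementation.

```python
import torch

def velocity_consistency_loss(v_attr: torch.Tensor,
                              v_geom: torch.Tensor,
                              v_photo: torch.Tensor) -> torch.Tensor:
    """Hypothetical velocity-aware consistency term.

    Penalizes pairwise disagreement between three per-Gaussian velocity
    estimates (all of shape (N, 3)): intrinsic motion attributes,
    geometry-derived motion, and photometrically estimated motion.
    """
    return ((v_attr - v_geom).pow(2).mean()
            + (v_attr - v_photo).pow(2).mean()
            + (v_geom - v_photo).pow(2).mean())

def unit_time_interval_loss(positions: torch.Tensor, dt: float) -> torch.Tensor:
    """Hypothetical unit-time interval term.

    positions: (T, N, 3) Gaussian centers sampled at T time steps spaced dt apart.
    Encourages near-constant displacement per unit time (small acceleration),
    i.e. temporal smoothness of the motion.
    """
    velocity = (positions[1:] - positions[:-1]) / dt      # (T-1, N, 3)
    acceleration = velocity[1:] - velocity[:-1]           # (T-2, N, 3)
    return acceleration.pow(2).mean()

# Illustrative usage: fold both terms into the training objective
# (render_loss, the velocity tensors, and the weights are placeholders).
# total_loss = render_loss \
#     + 0.1 * velocity_consistency_loss(v_attr, v_geom, v_photo) \
#     + 0.01 * unit_time_interval_loss(positions, dt=1.0 / 30)
```

This decomposition simply mirrors the abstract's description: one term ties the three motion representations together, the other penalizes velocity changes per unit time; the actual losses, weighting, and scheduling in Phys4DGS may differ.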
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 18167