Parametric SDF for Dynamic Surface Reconstruction

16 Sept 2025 (modified: 11 Feb 2026) · Submitted to ICLR 2026 · CC BY 4.0
Keywords: 3D Reconstruction, Dynamic Reconstruction, Mesh Extraction
Abstract: Reconstructing high-fidelity, temporally coherent surfaces of dynamic scenes remains a critical challenge in computer vision. While recent methods excel at novel view synthesis, they often fail to recover accurate geometry, yielding noisy or temporally inconsistent meshes that are suboptimal for downstream applications such as simulation or editing. In this work, we introduce a new paradigm for dynamic surface reconstruction based on a parametric Signed Distance Function ({\nameshort}). Our key insight is to generalize static SDF fields—where each spatial point stores a constant value—into time-dependent parametric curves, where each curve defines a temporally evolving SDF trajectory. This parametric SDF modeling provides a principled way to capture complex temporal variations, naturally enforcing smoothness and continuity in shape dynamics. At each timestamp, a static SDF field can be queried from {\nameshort} and converted into an explicit surface mesh via differentiable iso-surfacing. By rendering these meshes with a physically based differentiable renderer, we optimize the underlying parametric curves end-to-end against 2D image observations. Our framework produces high-fidelity, temporally coherent surfaces and inherently disentangles geometry, material, and lighting from multi-view videos. It robustly reconstructs geometry under large-scale motions and resolves appearance ambiguities caused by challenging lighting and occlusions. Experiments on both synthetic and real-world scenes demonstrate that our method achieves state-of-the-art geometric accuracy and temporal consistency, delivering detailed meshes that surpass prior work.
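To make the core idea concrete, here is a minimal sketch of the "SDF value as a parametric curve" formulation described in the abstract. The grid layout, Bernstein (Bezier) basis, and all names (`ParametricSDFGrid`, `query`, the grid resolution and curve degree) are illustrative assumptions, not the paper's actual implementation; the paper additionally optimizes these curves through differentiable iso-surfacing and rendering, which is omitted here.

```python
import numpy as np
from math import comb

def bernstein_basis(t, degree):
    """Bernstein polynomial basis evaluated at time t in [0, 1]."""
    return np.array([comb(degree, k) * t**k * (1.0 - t)**(degree - k)
                     for k in range(degree + 1)])

class ParametricSDFGrid:
    """Hypothetical dense voxel grid where each grid point stores Bezier
    control values instead of a single SDF scalar, so the signed distance
    at that point evolves smoothly and continuously over time."""
    def __init__(self, resolution=32, degree=3, seed=0):
        rng = np.random.default_rng(seed)
        # One (degree+1)-vector of curve control values per grid point.
        self.coeffs = rng.normal(size=(resolution,) * 3 + (degree + 1,))
        self.degree = degree

    def query(self, t):
        """Collapse the parametric field to a static SDF grid at time t,
        ready for iso-surfacing (e.g., marching cubes)."""
        basis = bernstein_basis(t, self.degree)  # (degree+1,)
        return self.coeffs @ basis               # (res, res, res)

field = ParametricSDFGrid()
sdf_t0 = field.query(0.0)   # static SDF snapshot at t = 0
sdf_mid = field.query(0.5)  # snapshot halfway through the sequence
```

Because the temporal behavior lives in a low-degree polynomial basis, smoothness in time is built in rather than enforced by an extra regularizer, which matches the abstract's claim that the parametrization "naturally" yields temporal coherence.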
Supplementary Material: zip
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 6455