MVGSR: Multi-View Consistency Gaussian Splatting for Robust Surface Reconstruction

ICLR 2026 Conference Submission7377 Authors

16 Sept 2025 (modified: 08 Oct 2025) · ICLR 2026 Conference Submission · CC BY 4.0
Keywords: Gaussian Splatting
Abstract: 3D Gaussian Splatting (3DGS) has recently emerged as a powerful approach for high-quality dense surface reconstruction of unknown scenes. However, existing methods are limited by the assumption of a static environment. In practice, they often fail in everyday scenarios containing dynamic objects and transient distractors, resulting in floating artifacts, geometric distortions, and view-dependent appearance errors in the reconstructed 3D models. We propose a robust surface reconstruction framework that combines Gaussian models with a heuristics-guided distractor masking strategy. Unlike prior methods that rely on MLP-based uncertainty modeling for distractor segmentation, our approach uses multi-view feature consistency to separate distractors from static content, yielding precise distractor masks early in training. To further improve reconstruction, we introduce a pruning mechanism that evaluates the visibility of each Gaussian across views: it resets the transmittance of unreliable points, suppressing floating artifacts and producing a more compact representation while preserving rendering quality. Finally, we design a multi-view consistency loss that enforces both structural and color coherence across views, improving the fidelity of Gaussian splats in distractor-heavy scenes. Extensive experiments demonstrate that our method achieves state-of-the-art geometric accuracy and rendering fidelity while remaining robust in dynamic and cluttered environments. The code will be made publicly available upon paper acceptance.
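The multi-view consistency loss described in the abstract could, in principle, combine a masked color term with a masked structural term computed between a reference rendering and a neighboring view warped into the reference frame. The sketch below is a minimal, hypothetical illustration of that idea, not the paper's actual loss: the function name, the gradient-based structural term, and the weighting parameter `w_struct` are all assumptions, and the distractor `mask` is assumed to be given.

```python
import numpy as np

def multiview_consistency_loss(ref, warped, mask, w_struct=0.5):
    """Hypothetical sketch of a multi-view consistency loss.

    ref, warped : (H, W, 3) images; `warped` is a neighbor view
                  warped into the reference camera frame.
    mask        : (H, W) binary mask that excludes distractor pixels.
    """
    m = mask.astype(float)
    denom = m.sum() + 1e-8

    # Color coherence: masked L1 photometric difference.
    color = (np.abs(ref - warped) * m[..., None]).sum() / denom

    # Structural coherence: masked L1 difference of image gradients.
    def grads(img):
        gy = np.diff(img, axis=0, prepend=img[:1])
        gx = np.diff(img, axis=1, prepend=img[:, :1])
        return gx, gy

    rx, ry = grads(ref)
    wx, wy = grads(warped)
    struct = ((np.abs(rx - wx) + np.abs(ry - wy)) * m[..., None]).sum() / denom

    return color + w_struct * struct
```

Under this formulation, perfectly consistent views incur zero loss, while distractor regions contribute nothing because they are masked out of both terms.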
Primary Area: applications to computer vision, audio, language, and other modalities
Submission Number: 7377