Published: 04 Oct 2024 · CC BY 4.0
This paper introduces 360-InpaintR, the first reference-based 360° inpainting method for 3D Gaussian Splatting (3DGS) scenes, designed in particular for unbounded environments. Our method leverages multi-view information and introduces an improved unseen-mask generation technique to address the challenges of view consistency and geometric plausibility in 360° scenes. We integrate reference-guided 3D inpainting with diffusion priors to ensure consistent results across diverse viewpoints. To facilitate research in this area, we present a new 360° inpainting dataset and capture protocol, enabling high-quality novel view synthesis and quantitative evaluation of modified scenes. Experimental results demonstrate that 360-InpaintR performs favorably against existing methods in both quantitative metrics and qualitative assessments, particularly in complex scenes with large view variations.