Abstract: We present a dense simultaneous localization and mapping (SLAM) method that uses 3D Gaussians as a scene representation.
Our approach enables interactive-time reconstruction and photo-realistic rendering from real-world single-camera RGBD videos.
To this end, we propose a novel strategy for seeding new Gaussians in newly explored areas and for effectively optimizing them online, with cost independent of the scene size and thus scalable to larger scenes. This is achieved by organizing the scene into sub-maps that are optimized independently and need not be kept in memory. We further accomplish frame-to-model camera tracking by minimizing photometric and geometric losses between the input and rendered frames.
The Gaussian representation allows for high-quality photo-realistic real-time rendering of real-world scenes.
Evaluation on synthetic and real-world datasets demonstrates competitive or superior performance in mapping, tracking,
and rendering compared to existing neural dense SLAM methods.
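The frame-to-model tracking described above combines a photometric term on rendered colors with a geometric term on rendered depth. The sketch below illustrates one plausible form of such a combined loss; the function name, L1 penalties, and the `lambda_geom` weighting are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def tracking_loss(rendered_rgb, rendered_depth, input_rgb, input_depth,
                  lambda_geom=0.5):
    """Illustrative combined photometric + geometric tracking loss.

    rendered_rgb / input_rgb: H x W x 3 arrays; rendered_depth /
    input_depth: H x W arrays. lambda_geom (an assumed hyperparameter)
    weights the depth term against the color term.
    """
    # Photometric term: mean L1 difference between rendered and observed colors.
    photo = np.abs(rendered_rgb - input_rgb).mean()
    # Geometric term: mean L1 depth difference over valid (positive-depth) pixels.
    valid = input_depth > 0
    geom = np.abs(rendered_depth[valid] - input_depth[valid]).mean()
    return photo + lambda_geom * geom
```

In a SLAM loop, this scalar would be minimized with respect to the camera pose while the Gaussian map is held fixed.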
Submission Length: Regular submission (no more than 12 pages of main content)
Changes Since Last Submission: 1. Fixed typos
2. Added distributed NeRF references
Assigned Action Editor: ~Matthew_Walter1
Submission Number: 3028