Abstract: We present a novel approach to real-time dense visual SLAM. Our system is capable of capturing comprehensive, dense, globally consistent surfel-based maps of room-scale environments and beyond, explored using an RGB-D camera in an incremental online fashion, without pose graph optimisation or any post-processing steps. This is accomplished by using dense frame-to-model camera tracking and windowed surfel-based fusion coupled with frequent model refinement through non-rigid surface deformations. Our approach applies local model-to-model surface loop closure optimisations as often as possible to stay close to the mode of the map distribution, while utilising global loop closure to recover from arbitrary drift and maintain global consistency. In the spirit of improving map quality as well as tracking accuracy and robustness, we furthermore explore a novel approach to real-time discrete light source detection. This technique is capable of detecting numerous light sources in indoor environments in real time as a handheld camera explores the scene. No prior information about the scene or the number of light sources is required. By making a small set of simple assumptions about the appearance properties of the scene, our method can incrementally estimate both the number and locations of multiple light sources in the environment in an online fashion. Our results demonstrate that our technique functions well in many different environments and lighting configurations. We show that this enables (a) more realistic augmented reality (AR) rendering; (b) a richer understanding of the scene beyond pure geometry; and (c) more accurate and robust photometric tracking.