Abstract: 3D surface reconstruction is essential across applications of virtual reality, robotics, and mobile scanning. However, RGB-based reconstruction often fails in low-texture, low-light, and low-albedo scenes. Handheld LiDARs, now common on mobile devices, aim to address these challenges by capturing depth information from time-of-flight measurements of a coarse grid of projected dots. Yet, these sparse LiDARs struggle with scene coverage given limited input views, leaving large gaps in depth information. In this work, we propose using an alternative class of “blurred” LiDAR that emits a diffuse flash, greatly improving scene coverage but introducing spatial ambiguity from mixed time-of-flight measurements across a wide field of view. To handle these ambiguities, we propose leveraging the complementary strengths of diffuse LiDAR and RGB. We introduce a Gaussian surfel-based rendering framework with a scene-adaptive loss function that dynamically balances RGB and diffuse LiDAR signals. We demonstrate that, surprisingly, diffuse LiDAR can outperform traditional sparse LiDAR, enabling robust 3D scanning with accurate color and geometry estimation in challenging environments.