Abstract: Lidar has become a cornerstone sensing modality for
3D vision, especially for large outdoor scenarios and autonomous driving. Conventional lidar sensors are capable
of providing centimeter-accurate distance information by
emitting laser pulses into a scene and measuring the time-of-flight (ToF) of the reflection. However, the polarization of the received light, which depends on surface orientation and material properties, is usually not considered. As
such, the polarization modality has the potential to improve
scene reconstruction beyond distance measurements. In this
work, we introduce a novel long-range polarization wavefront lidar sensor (PolLidar) that modulates the polarization of the emitted and received light. Departing from conventional lidar sensors, PolLidar allows access to the raw
time-resolved polarimetric wavefronts. We leverage these wavefronts to estimate normals, distance, and material properties in outdoor scenarios with a novel learned
reconstruction method. To train and evaluate the method,
we introduce a simulated and real-world long-range dataset
with paired raw lidar data, ground truth distance, and normal maps. We find that the proposed method improves
normal and distance reconstruction by 53% in mean angular error and 41% in mean absolute error compared to existing
shape-from-polarization (SfP) and ToF methods. Code and
data are open-sourced here1
.