Empirical Study: Monocular Depth Estimation from RGB, NIR, Thermal Image in Adverse Weather Conditions
Abstract: Robust spatial understanding is a fundamental requirement for safety-aware autonomous driving in adverse weather and lighting conditions, such as rain, fog, haze, snow, and low-light environments. Therefore, numerous autonomous vehicle platforms adopt various sensor modalities (e.g., RGB camera, NIR camera, thermal camera, LiDAR, and RADAR) to ensure safety and reliability. Among them, the RGB camera is a commonly adopted complementary sensor because it provides dense spatial understanding, in contrast to the sparse measurements of LiDAR and RADAR. However, the RGB camera is known to be vulnerable to changes in lighting and weather conditions. In this paper, we empirically analyze the robustness of monocular depth estimation from RGB images under diverse seasonal, weather, and lighting conditions. We also investigate the robustness of depth estimation from NIR and thermal images under the same conditions to determine which sensor remains robust to environmental changes and is capable of dense spatial understanding even in extreme conditions. As a result, we found that thermal cameras can provide reliable and robust dense spatial understanding across diverse seasonal, weather, and lighting changes.
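For readers unfamiliar with the task itself, the following is a minimal sketch of single-image (monocular) depth inference using a publicly available pretrained model (MiDaS via torch.hub). It is not the paper's pipeline or evaluation protocol; the model choice, the image path, and the device handling are illustrative assumptions only, meant to show what "depth estimation from a single RGB image" looks like in practice.

```python
# Sketch only: off-the-shelf monocular depth inference with MiDaS.
# This is NOT the method evaluated in the paper; it merely illustrates the task.
import cv2
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Load a small pretrained MiDaS model and its matching input transform.
midas = torch.hub.load("intel-isl/MiDaS", "MiDaS_small").to(device).eval()
transforms = torch.hub.load("intel-isl/MiDaS", "transforms")
transform = transforms.small_transform

# "example.jpg" is a placeholder path; the paper's RGB/NIR/thermal data is not used here.
img = cv2.cvtColor(cv2.imread("example.jpg"), cv2.COLOR_BGR2RGB)
input_batch = transform(img).to(device)

with torch.no_grad():
    prediction = midas(input_batch)
    # Upsample the predicted (inverse-)depth map back to the input resolution.
    depth = torch.nn.functional.interpolate(
        prediction.unsqueeze(1),
        size=img.shape[:2],
        mode="bicubic",
        align_corners=False,
    ).squeeze()

depth_map = depth.cpu().numpy()  # dense per-pixel relative depth
```

The study's premise is that such a dense per-pixel prediction degrades when the RGB input is corrupted by rain, fog, snow, or low light, which is why the same task is also examined with NIR and thermal inputs.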