uPLAM: Robust Panoptic Localization and Mapping Leveraging Perception Uncertainties

Published: 09 Apr 2024, Last Modified: 26 Apr 2024
Venue: ICRA 2024: Back to the Future (Spotlight)
License: CC BY 4.0
Keywords: uncertainty estimation, semantic mapping, localization, panoptic segmentation
TL;DR: We introduce uncertainty-aware Panoptic Localization and Mapping (uPLAM), which leverages pixel-wise perception uncertainties to fuse CNN-based panoptic predictions with classical probabilistic approaches for robust localization and mapping.
Abstract: The availability of a robust map-based localization system is essential for the operation of many autonomously navigating vehicles. Since uncertainty is an inevitable part of perception, accounting for it in the downstream tasks of a navigation stack improves the robot's robustness. In particular, localization and mapping methods, which in modern systems often employ convolutional neural networks (CNNs) for perception tasks, require proper uncertainty estimates. In this work, we present uncertainty-aware Panoptic Localization and Mapping (uPLAM), which employs pixel-wise uncertainty estimates of panoptic CNNs as a bridge to fuse modern perception with classical probabilistic localization and mapping approaches. Beyond perception, we introduce an uncertainty-based map aggregation technique to create accurate panoptic maps that contain surface semantics and landmark instances. Moreover, we provide cell-wise map uncertainties and present a particle-filter-based localization method that incorporates perception uncertainties. Extensive evaluations show that our proposed incorporation of uncertainties leads to more accurate maps with reliable uncertainty estimates and improved localization accuracy. Additionally, we present the Freiburg Panoptic Driving dataset for evaluating panoptic mapping and localization methods. We make our code and dataset available at: http://uplam.cs.uni-freiburg.de
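To make the core idea concrete, below is a minimal sketch (not the authors' implementation) of uncertainty-based map aggregation as the abstract describes it: per-pixel semantic votes for a map cell are weighted by the confidence of the panoptic CNN, and the residual belief mass yields a cell-wise map uncertainty. All names (`aggregate_cell`, `NUM_CLASSES`, the [0, 1) uncertainty convention) are illustrative assumptions, not taken from the paper.

```python
# Illustrative sketch of uncertainty-weighted semantic map aggregation.
# Assumption: the CNN provides a per-pixel uncertainty in [0, 1),
# where 0 means fully confident. Not the authors' actual code.

import numpy as np

NUM_CLASSES = 5  # hypothetical number of surface-semantic classes


def aggregate_cell(labels, uncertainties, num_classes=NUM_CLASSES):
    """Fuse per-pixel class observations falling into one map cell.

    labels:        integer class indices observed for this cell
    uncertainties: per-observation uncertainty estimates from the CNN
    Returns the fused label and a cell-wise uncertainty estimate.
    """
    weights = 1.0 - np.asarray(uncertainties)   # confident pixels count more
    scores = np.bincount(np.asarray(labels), weights=weights,
                         minlength=num_classes)
    scores /= scores.sum()                      # normalized class belief
    fused_label = int(np.argmax(scores))
    cell_uncertainty = 1.0 - scores[fused_label]  # leftover belief mass
    return fused_label, cell_uncertainty


# Example: two confident, agreeing observations outvote one uncertain one.
print(aggregate_cell(labels=[2, 2, 4], uncertainties=[0.1, 0.2, 0.6]))
# -> (2, ~0.19)
```

The same weighting idea carries over to the particle-filter localization the abstract mentions: observations with low perception uncertainty would contribute more strongly to particle weights, though the paper should be consulted for the exact formulation.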
Submission Number: 3