Perceive With Confidence: Statistical Safety Assurances for Navigation with Learning-Based Perception
Keywords: Uncertainty quantification, occupancy prediction, robot navigation
TL;DR: An uncertainty quantification framework for perception-based navigation tasks that provides formal assurances for end-to-end safety.
Abstract: Rapid advances in perception have enabled large pre-trained models to be used out of the box for transforming high-dimensional, noisy, and partial observations of the world into rich occupancy representations. However, the reliability of these models, and consequently the safety of integrating them on robots, remains unknown when they are deployed in environments unseen during training. In this work, we address this challenge by rigorously quantifying the uncertainty of pre-trained perception systems for object detection via a novel calibration technique based on conformal prediction. Crucially, this procedure guarantees robustness to distribution shifts in states when perceptual outputs are used in conjunction with a planner. As a result, the calibrated perception system can be combined with any safe planner to provide an end-to-end statistical assurance on safety in unseen environments. We evaluate the resulting approach, Perceive with Confidence (PwC), with experiments in simulation and on hardware, where a quadruped robot navigates through previously unseen indoor, static environments. These experiments validate the safety assurances for obstacle avoidance provided by PwC and demonstrate improvements in empirical safety of up to 40% compared to baselines.
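The calibration technique named in the abstract is based on conformal prediction. A minimal sketch of how such a calibration could look, assuming split conformal prediction with a scalar nonconformity score that measures how far a predicted obstacle boundary underestimates the true obstacle; the score definition, data, and inflation rule below are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def conformal_quantile(scores, alpha):
    """Finite-sample-corrected (1 - alpha) quantile of calibration scores."""
    n = len(scores)
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)
    return float(np.quantile(scores, q_level, method="higher"))

# Hypothetical calibration set: each score is the margin (in meters) by which
# the detector's predicted obstacle boundary underestimated the true obstacle
# (0 if the prediction already contained it). Synthetic data for illustration.
rng = np.random.default_rng(0)
scores = np.abs(rng.normal(0.0, 0.05, size=200))

# Calibrate: with probability >= 1 - alpha, inflating predictions by eps
# covers the true obstacle on a fresh example exchangeable with this set.
eps = conformal_quantile(scores, alpha=0.15)

# At deployment, inflate every predicted obstacle by eps before handing the
# occupancy representation to a safe planner.
predicted_box = np.array([1.0, 2.0, 0.5, 0.5])  # x, y, width, height (meters)
inflated_box = predicted_box + np.array([-eps, -eps, 2 * eps, 2 * eps])
```

Under this sketch, any planner that avoids the inflated obstacles inherits the (1 - alpha) coverage guarantee, which is the sense in which a calibrated perception system composes with a safe planner into an end-to-end statistical safety assurance.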
Supplementary Material: zip
Video: https://youtu.be/sztc832hw0c?feature=shared
Website: https://perceive-with-confidence.github.io/
Code: https://github.com/irom-lab/perception-guarantees
Publication Agreement: pdf
Student Paper: no
Spotlight Video: mp4
Submission Number: 470