Wait, That Feels Familiar: Learning to Extrapolate Human Preferences for Preference-Aligned Path Planning

Published: 07 May 2023, Last Modified: 08 May 2023, ICRA-23 Workshop on Pretraining4Robotics (Lightning)
Keywords: Vision-Based Navigation, Learning from Experience
Abstract: Autonomous mobility tasks such as last-mile delivery require reasoning about operator-indicated preferences over different types of terrain to ensure robot safety and mission success. However, coping with out-of-distribution data, such as encountering novel or visually distinct terrains due to lighting variations, remains a fundamental problem in visual navigation. Existing solutions either require labor-intensive manual data recollection and labeling or use hand-coded reward functions that may not align with operator preferences. In this work, we posit that in many situations, operator preferences over novel terrains can be inferred by relating inertial-tactile observations of novel terrains to known terrains experienced by the robot. Leveraging this insight, we introduce "Preference Adaptation for Terrain-awarE Robot Navigation" (PATERN), a novel framework for extrapolating operator terrain preferences for visual navigation. PATERN learns an inertial-tactile representation space from the robot's experience and uses nearest-neighbor search in this space to estimate operator preferences over novel terrains. Through physical robot experiments in off-road environments, we evaluate PATERN's adaptability to novel terrains and challenging lighting conditions. Compared to baseline approaches, PATERN successfully generalizes to novel terrains and varied lighting conditions while remaining aligned with operator preferences.
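The abstract describes assigning preferences to novel terrains by nearest-neighbor lookup in a learned inertial-tactile embedding space. A minimal sketch of that lookup step, assuming hypothetical embeddings and preference labels (the names, dimensions, and label convention below are illustrative assumptions, not the paper's actual implementation):

```python
import numpy as np

# Hypothetical 2-D embeddings of terrains the robot has already experienced,
# each paired with an operator preference label (0 = preferred, 1 = avoid).
known_embeddings = np.array([[0.1, 0.9],   # e.g. pavement
                             [0.8, 0.2],   # e.g. loose gravel
                             [0.5, 0.5]])  # e.g. grass
known_preferences = np.array([0, 1, 0])

def extrapolate_preference(novel_embedding: np.ndarray) -> int:
    """Assign a novel terrain the preference of its nearest known terrain
    in the inertial-tactile representation space."""
    dists = np.linalg.norm(known_embeddings - novel_embedding, axis=1)
    return int(known_preferences[np.argmin(dists)])

# A novel terrain whose inertial-tactile signature resembles pavement
# inherits the "preferred" label, even if it looks visually different.
print(extrapolate_preference(np.array([0.15, 0.85])))  # → 0
```

This captures only the retrieval step; learning the representation space itself and integrating the extrapolated preferences into the planner are separate components of the framework.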