Keywords: Robot Perception, Sensing & Vision, Robot Planning, Navigation, Field Robotics
TL;DR: We present the Long Range Navigator (LRN), a learned system that identifies ‘affordable’ frontier directions from camera images beyond the range of local metric maps, extending the planning horizon for robots navigating without prior maps.
Abstract: A robot navigating an outdoor environment with no prior knowledge of the space must rely on its local sensing, which takes the form of a local metric map or a local policy with a fixed horizon. A limited planning horizon often leads to myopic decisions that take the robot off course or, worse, into very difficult terrain. In this work, we make a key observation: long-range navigation requires only identifying good frontier directions for planning rather than full map knowledge. To this end, we introduce the Long Range Navigator (LRN), which learns to predict ‘affordable’ frontier directions from high-dimensional camera images. LRN is trained entirely on unlabeled egocentric videos, making it scalable and adaptable. In off-road tests on Spot and a large vehicle, LRN reduces human interventions and improves decision speed when integrated into existing navigation stacks.
Supplementary Material: zip
Spotlight: mp4
Submission Number: 696