Keywords: Robot Perception, Sensing & Vision, Robot Planning, Navigation, Field Robotics
TL;DR: We present the Long Range Navigator (LRN), a learned system that uses camera data to identify affordable frontiers beyond the range of local maps, extending the planning horizon for robots navigating without prior maps.
Abstract: A robot navigating an outdoor environment with no prior knowledge of the space must rely on its local sensing, in the form of a local metric map or a local policy with some fixed horizon. A limited planning horizon can often result in myopic decisions that lead the robot off course or, worse, into very difficult terrain. In this work, we make the key observation that effective long range navigation requires only identifying good frontier directions for planning rather than full map knowledge. To address this, we introduce the Long Range Navigator (\texttt{LRN}), which learns to predict ‘affordable’ frontier directions from camera images. \texttt{LRN} is trained entirely on unlabeled egocentric videos, making it scalable and adaptable. In off-road tests on Spot and a large vehicle, \texttt{LRN} reduces human interventions and improves decision speed when integrated into existing navigation stacks.
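To make the integration described in the abstract concrete, the sketch below illustrates one plausible way a learned frontier-direction predictor could hand an intermediate goal to an existing local planner. This is a hypothetical illustration, not the authors' implementation: the predictor interface, heading-bin layout, and goal-projection distance are all assumptions.

```python
# Hypothetical sketch (not the authors' code): plugging a learned
# frontier-direction predictor into an existing local planning stack.
import numpy as np

NUM_BINS = 36            # candidate headings, 10 degrees apart (assumed)
GOAL_DISTANCE_M = 50.0   # project the chosen frontier beyond the local map (assumed)


def score_frontier_directions(image: np.ndarray) -> np.ndarray:
    """Stand-in for the learned model: one 'affordability' score per heading bin.

    A real system would run a trained network on the camera image; uniform
    scores are returned here only so the sketch runs end to end.
    """
    return np.full(NUM_BINS, 1.0 / NUM_BINS)


def select_intermediate_goal(image: np.ndarray, robot_pose: tuple) -> tuple:
    """Pick the highest-scoring frontier direction and convert it into a
    waypoint that a downstream local planner can consume."""
    scores = score_frontier_directions(image)
    best_bin = int(np.argmax(scores))
    # Heading bins are assumed to be centered on the robot's current yaw.
    x, y, yaw = robot_pose
    heading = yaw + (best_bin - NUM_BINS // 2) * (2 * np.pi / NUM_BINS)
    return (x + GOAL_DISTANCE_M * np.cos(heading),
            y + GOAL_DISTANCE_M * np.sin(heading))


if __name__ == "__main__":
    dummy_image = np.zeros((480, 640, 3), dtype=np.uint8)
    print(select_intermediate_goal(dummy_image, robot_pose=(0.0, 0.0, 0.0)))
```

The design choice mirrored here is that the learned component only proposes a direction beyond the local map's range; the existing planner remains responsible for short-horizon obstacle avoidance.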
Submission Number: 21