Keywords: Learning Abstractions, Mobile Manipulation
TL;DR: We present an efficient approach to learning transferable abstractions for task planning with mobile manipulators.
Abstract: We address the problem of efficiently learning high-level abstractions for task-level robot planning. Existing approaches require large amounts of data and fail to generalize learned abstractions to new environments. To address this, we propose to exploit the independence between spatial and non-spatial state variables in the preconditions of manipulation and navigation skills, mirroring the manipulation-navigation split in robotics research. Given a collection of portable manipulation abstractions (i.e., object-centric manipulation skills paired with matching symbolic representations), we derive an algorithm to automatically generate navigation abstractions that support mobile manipulation planning in a novel environment. We apply our approach in simulation in AI2Thor and on real robot hardware with a coffee-preparation task, generating plannable representations for mobile manipulators in just a few minutes of robot time and significantly outperforming state-of-the-art baselines.
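The factoring the abstract describes can be sketched roughly as follows. This is a minimal illustration, not the paper's actual algorithm or representation: the `Operator` class, the `at-base-pose` predicate convention, and the always-applicable navigation precondition are all assumptions. The idea shown is that the spatial part of a manipulation skill's precondition can be split off and used as the effect of an auto-generated navigation operator.

```python
from dataclasses import dataclass

@dataclass
class Operator:
    """Hypothetical symbolic operator: name, preconditions, effects."""
    name: str
    preconditions: frozenset
    effects: frozenset

def is_spatial(pred: str) -> bool:
    # Assumed convention: spatial predicates describe the robot's base pose.
    return pred.startswith("at-base-pose")

def derive_navigation_operator(manip_op: Operator, obj: str) -> Operator:
    """Generate a navigation operator whose effect is exactly the spatial
    precondition the manipulation skill needs in order to act on `obj`."""
    spatial = {p for p in manip_op.preconditions if is_spatial(p)}
    return Operator(
        name=f"navigate-to-{obj}",
        preconditions=frozenset(),  # assumed always applicable for this sketch
        effects=frozenset(spatial),
    )

# Usage: a pick skill whose precondition mixes spatial and non-spatial variables.
pick = Operator(
    name="pick-mug",
    preconditions=frozenset({"at-base-pose(mug)", "hand-empty()"}),
    effects=frozenset({"holding(mug)"}),
)
nav = derive_navigation_operator(pick, "mug")
print(nav)  # navigate-to-mug achieves at-base-pose(mug)
```

With navigation operators generated this way, a task planner can chain navigate-then-manipulate steps in a new environment without relearning the manipulation abstractions themselves.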
Student First Author: yes
Supplementary Material: zip
Instructions: I have read the instructions for authors (https://corl2023.org/instructions-for-authors/)
Video: https://www.youtube.com/watch?v=mnete3JteqE&ab_channel=EricRosen
Website: https://github.com/ericrosenbrown/aosm_experiments
Code: https://github.com/ericrosenbrown/aosm_experiments
Publication Agreement: pdf
Poster Spotlight Video: mp4