Learning Rational Skills for Planning from Demonstrations and Instructions

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: Learning for Planning, Compositional Generalization
Abstract: We present a framework for learning compositional, rational skill models (RatSkills) that support efficient planning and inverse planning for achieving novel goals and recognizing activities. In contrast to directly learning a set of policies that map states to actions, RatSkills represent each skill as a subgoal that can be executed by a planning subroutine. RatSkills can be learned by observing expert demonstrations and reading abstract language descriptions of the corresponding task (e.g., collect wood then craft a boat then go across the river). The learned subgoal-based representation enables inference of another agent's intended task from their actions via Bayesian inverse planning. It also supports planning for novel objectives given in the form of either temporal task descriptions or black-box goal tests. We demonstrate through experiments in both discrete and continuous domains that our learning algorithms recover a set of RatSkills by observing and explaining other agents' movements, and plan efficiently for novel goals by composing learned skills.
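The sketch below is a minimal, purely illustrative rendering of the Bayesian inverse-planning idea described in the abstract, not the paper's implementation: candidate tasks are treated as sequences of named subgoals, and a simple likelihood scores how well an observed trajectory matches each sequence. The task names, subgoal names, and the form of the likelihood are all assumptions introduced only for illustration.

```python
# Illustrative sketch: Bayesian inverse planning over subgoal-based skills.
# Task names, subgoal names, and the likelihood model are hypothetical.

# Candidate tasks, each represented as an ordered sequence of subgoals.
TASKS = {
    "craft_boat": ["collect_wood", "craft_boat", "cross_river"],
    "build_house": ["collect_wood", "collect_stone", "build_house"],
}


def subgoal_likelihood(observed_states, subgoals, noise=0.1):
    """Crude likelihood: reward trajectories that hit subgoals in order."""
    reached, idx = 0, 0
    for state in observed_states:
        if idx < len(subgoals) and state == subgoals[idx]:
            reached += 1
            idx += 1
    p_hit = (1 - noise) if reached else noise
    return p_hit * (reached + 1) / (len(subgoals) + 1)


def infer_task(observed_states, prior=None):
    """Posterior over candidate tasks given an observed state sequence."""
    prior = prior or {t: 1 / len(TASKS) for t in TASKS}
    scores = {
        t: prior[t] * subgoal_likelihood(observed_states, sg)
        for t, sg in TASKS.items()
    }
    z = sum(scores.values())
    return {t: s / z for t, s in scores.items()}


if __name__ == "__main__":
    trajectory = ["collect_wood", "craft_boat"]
    # Expect a higher posterior on "craft_boat" than "build_house".
    print(infer_task(trajectory))
```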
