Complex Skill Acquisition through Simple Skill Imitation Learning

Anonymous

23 Oct 2020 (modified: 05 May 2023) | Submitted to NeurIPS 2020 Deep Inverse Workshop | Readers: Everyone
Keywords: imitation learning, generative model, transfer learning, hierarchical learning, concurrent learning
TL;DR: We propose a new algorithm that trains neural network policies on simple, easy-to-learn skills in order to cultivate latent spaces that accelerate imitation learning of complex, hard-to-learn skills.
Abstract: Humans often think of complex tasks as combinations of simpler subtasks in order to learn those complex tasks more efficiently. For example, a backflip could be considered a combination of four subskills: jumping, tucking knees, rolling backwards, and thrusting arms downwards. Motivated by this line of reasoning, we propose a new algorithm that trains neural network policies on simple, easy-to-learn skills in order to cultivate latent spaces that accelerate imitation learning of complex, hard-to-learn skills. We focus on the case in which the complex task comprises a concurrent (and possibly sequential) combination of the simpler subtasks, and therefore our algorithm can be seen as a novel approach to concurrent hierarchical imitation learning. We evaluate our algorithm on difficult tasks in a high-dimensional environment and see that it consistently outperforms a state-of-the-art baseline in training speed and overall performance.
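The core idea in the abstract (pretrain a shared latent space on simple skills, then imitate a complex skill that combines them) can be illustrated with a minimal toy sketch. Everything below is a hypothetical construction for intuition, not the paper's actual method: linear encoder/decoder policies stand in for neural networks, behavior cloning (MSE regression on expert actions) stands in for the imitation objective, and the "complex" expert is simply the sum of two "simple" linear experts. All names (`bc_step`, `simple_experts`, dimension constants) are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
STATE_DIM, LATENT_DIM, ACTION_DIM = 8, 4, 2  # toy sizes, chosen arbitrarily

def bc_step(W_enc, W_dec, states, actions, lr=0.05, train_enc=True):
    """One behavior-cloning (MSE) gradient step on a linear encoder/decoder policy.

    Policy: action = (state @ W_enc) @ W_dec, so W_enc defines the shared
    latent skill space and W_dec is a skill-specific decoder head.
    """
    z = states @ W_enc                      # latent representation of the states
    err = z @ W_dec - actions               # prediction error vs. expert actions
    scale = 2.0 / (states.shape[0] * ACTION_DIM)
    g_dec = scale * (z.T @ err)             # gradient w.r.t. the decoder head
    g_enc = scale * (states.T @ (err @ W_dec.T))  # gradient w.r.t. the encoder
    W_dec -= lr * g_dec
    if train_enc:
        W_enc -= lr * g_enc                 # encoder is frozen when train_enc=False
    return float(np.mean(err ** 2))

# Two "simple" expert skills, each an arbitrary linear state->action map.
simple_experts = [rng.normal(size=(STATE_DIM, ACTION_DIM)) for _ in range(2)]

# Phase 1: pretrain a shared encoder with one decoder head per simple skill.
W_enc = rng.normal(scale=0.1, size=(STATE_DIM, LATENT_DIM))
heads = [rng.normal(scale=0.1, size=(LATENT_DIM, ACTION_DIM)) for _ in simple_experts]
for _ in range(2000):
    for W_exp, W_dec in zip(simple_experts, heads):
        s = rng.normal(size=(64, STATE_DIM))
        bc_step(W_enc, W_dec, s, s @ W_exp)  # clone each simple expert

# Phase 2: the "complex" skill combines the simple skills; imitate it by
# training only a fresh decoder head on top of the frozen pretrained latents.
W_complex = sum(simple_experts)
W_dec_new = rng.normal(scale=0.1, size=(LATENT_DIM, ACTION_DIM))
losses = []
for _ in range(500):
    s = rng.normal(size=(64, STATE_DIM))
    losses.append(bc_step(W_enc, W_dec_new, s, s @ W_complex, train_enc=False))
print(f"complex-skill imitation loss: {losses[0]:.3f} -> {losses[-1]:.3f}")
```

In this linear toy, the combined expert's relevant state directions lie in the span already covered by the two simple experts, so a latent space pretrained on the simple skills can represent the complex skill without further encoder training. That is the intuition the sketch is meant to convey; the paper's actual algorithm, architecture, and objective are described in the PDF.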