Keywords: Robotics, Imitation Learning
Abstract: General-purpose robots need to continuously acquire new skills over a lifelong span without revisiting past experiences, a setting known as rehearsal-free lifelong learning, which remains highly challenging. Recent approaches learn a separate adapter alongside a pretrained policy for each new skill to address catastrophic forgetting, but ignore the knowledge shared between old and new skills. To tackle this issue, we propose Primitive-level Skill Prompt Learning (PSPL), which achieves lifelong robot manipulation via reusable and extensible primitives. Our scheme has two stages. First, during multi-skill pre-training, we learn a set of prefix skill prompts that extract shared knowledge: motion-aware skill prompts capture the semantic and motion primitives shared across different skills. Second, when acquiring new skills over the lifelong span, new prefix skill prompts are added and learned via cross-attention with the prefix prompts of old skills, boosting new-skill learning through shared knowledge transfer. For evaluation, we construct a large-scale skill dataset and conduct extensive experiments on both simulated and real-world tasks, demonstrating PSPL's superior performance over state-of-the-art methods. Code and dataset will be released upon acceptance.
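The second stage described above — new prefix prompts attending over frozen old-skill prompts — can be sketched minimally as follows. This is a hypothetical illustration, not the paper's implementation: all names (`PrefixSkillPrompts`, `new_prompts`, prompt counts, dimensions) are assumptions, and the actual architecture may differ.

```python
import torch
import torch.nn as nn

class PrefixSkillPrompts(nn.Module):
    """Hypothetical sketch: new-skill prefix prompts refined by
    cross-attention over frozen old-skill prompts."""
    def __init__(self, n_new: int = 4, dim: int = 64, n_heads: int = 4):
        super().__init__()
        # learnable prefix prompts for the new skill
        self.new_prompts = nn.Parameter(torch.randn(n_new, dim))
        self.cross_attn = nn.MultiheadAttention(dim, n_heads, batch_first=True)

    def forward(self, old_prompts: torch.Tensor) -> torch.Tensor:
        # old_prompts: (n_old, dim) — frozen prompts from previously learned skills
        q = self.new_prompts.unsqueeze(0)       # queries: (1, n_new, dim)
        kv = old_prompts.detach().unsqueeze(0)  # keys/values, detached so old skills stay fixed
        transferred, _ = self.cross_attn(q, kv, kv)
        # residual combination: new prompts plus knowledge transferred from old skills,
        # to be prepended as prefix tokens to the policy's input sequence
        return self.new_prompts + transferred.squeeze(0)  # (n_new, dim)

old_prompts = torch.randn(8, 64)              # e.g. 8 prompts from old skills
new_prefix = PrefixSkillPrompts()(old_prompts)
print(new_prefix.shape)                       # torch.Size([4, 64])
```

Detaching the old prompts in the attention keys/values reflects the rehearsal-free constraint: knowledge from old skills is reused for transfer without being updated, which avoids catastrophic forgetting by construction.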
Primary Area: applications to robotics, autonomy, planning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3048