DRAE: Dynamic Retrieval-Augmented Expert Networks for Lifelong Learning and Task Adaptation in Robotics

ACL ARR 2025 February Submission 2329 Authors

14 Feb 2025 (modified: 09 May 2025) · ACL ARR 2025 February Submission · CC BY 4.0
Abstract: We present \textbf{Dynamic Retrieval-Augmented Expert Networks (DRAE)}, a novel architecture that integrates Mixture-of-Experts (MoE), Retrieval-Augmented Generation (RAG), and hierarchical reinforcement learning (RL) with ReflexNet-SchemaPlanner-HyperOptima (RSHO) coordination to address lifelong learning, catastrophic forgetting, and task adaptation in robotics. DRAE dynamically routes expert models via a sparse MoE gating mechanism, enabling efficient resource allocation, and augments learning with external knowledge through parametric retrieval (P-RAG). We further propose a hierarchical RL framework in which ReflexNet handles low-level task execution, SchemaPlanner performs symbolic reasoning, and HyperOptima models long-term context, ensuring continuous adaptation and memory retention. Experimental results show that DRAE significantly outperforms baseline approaches in long-term task retention and knowledge reuse, achieving an average task success rate of 82.5\% across a set of dynamic robotic manipulation tasks, compared to 74.2\% for traditional MoE models. Furthermore, DRAE maintains an exceptionally low forgetting rate of 0.1\%, outperforming state-of-the-art methods in mitigating catastrophic forgetting. These results demonstrate the effectiveness of our approach in enabling flexible, scalable, and efficient lifelong learning for robotics.
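
To make the abstract's "sparse MoE gating mechanism" concrete, the sketch below shows a generic top-k gating layer in PyTorch. It is an illustrative assumption only, not the paper's DRAE implementation: the class names, linear router, linear experts, and dimensions are all hypothetical.

```python
# Minimal sketch of sparse top-k MoE gating (illustrative only, NOT the DRAE code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseTopKGate(nn.Module):
    """Routes each input to its top-k experts and mixes their outputs."""
    def __init__(self, dim: int, num_experts: int, k: int = 2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(dim, num_experts)  # routing logits per expert
        self.experts = nn.ModuleList(
            [nn.Linear(dim, dim) for _ in range(num_experts)]  # placeholder experts
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, dim)
        logits = self.router(x)                            # (batch, num_experts)
        topk_vals, topk_idx = logits.topk(self.k, dim=-1)  # keep only k experts
        weights = F.softmax(topk_vals, dim=-1)             # renormalize over the k
        out = torch.zeros_like(x)
        for slot in range(self.k):
            idx = topk_idx[:, slot]                         # chosen expert per sample
            w = weights[:, slot].unsqueeze(-1)
            for e, expert in enumerate(self.experts):
                mask = idx == e
                if mask.any():
                    out[mask] += w[mask] * expert(x[mask])  # weighted expert output
        return out

# Usage: gate = SparseTopKGate(dim=256, num_experts=8, k=2); y = gate(torch.randn(4, 256))
```

Because only k of the experts run per input, compute grows with k rather than with the total number of experts, which is the resource-allocation benefit the abstract refers to.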
Paper Type: Long
Research Area: Multimodality and Language Grounding to Vision, Robotics and Beyond
Research Area Keywords: robotics, lifelong learning, dynamic neural network, mixture-of-experts, retrieval-augmented generation, hierarchical reinforcement learning
Contribution Types: Model analysis & interpretability, NLP engineering experiment, Theory
Languages Studied: English
Submission Number: 2329