Abstract: In this paper, an efficient skill learning framework is proposed for robotic insertion, based on one-shot demonstration and reinforcement learning. First, the robot action is composed of two parts: an expert action and a refinement action. A force Jacobian matrix is calibrated with only one demonstration, based on which stable and safe expert actions can be generated. The deep deterministic policy gradient (DDPG) method is employed to learn the refinement action, which aims to improve assembly efficiency. Second, an episode-step exploration strategy is developed, which uses the expert action as a benchmark and adjusts the exploration intensity dynamically. A safety-efficiency reward function is designed for compliant insertion. Third, to improve adaptability to different components, a skill saving and selection mechanism is proposed. Several typical components are used to train the skill models, and the trained models and force Jacobian matrices are saved in a skill pool. Given a new component, the most appropriate model is selected from the skill pool according to its force Jacobian matrix and used directly to accomplish the insertion task. Fourth, a simulation environment is established under the guidance of the force Jacobian matrix, which avoids a tedious training process on real robotic systems. Simulations and experiments are conducted to validate the effectiveness of the proposed methods.
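The abstract's composition of expert and refinement actions could take a form like the minimal sketch below, assuming the calibrated force Jacobian relates end-effector motion to changes in the contact wrench and that the DDPG actor outputs a bounded residual. All names (J, wrench_ref, alpha, beta, policy) are illustrative assumptions, not the paper's API.

```python
# Sketch: expert action from the force Jacobian plus a learned refinement term.
import numpy as np

def expert_action(J, wrench, wrench_ref, alpha=0.1):
    """Compliance-style correction: drive the measured wrench toward the
    reference wrench using the pseudo-inverse of the force Jacobian J."""
    error = wrench - wrench_ref                  # 6-D force/torque error
    return -alpha * np.linalg.pinv(J) @ error    # small corrective motion

def composite_action(J, wrench, wrench_ref, state, policy, beta=0.5):
    """Total command = safe expert action + scaled refinement from the actor."""
    a_expert = expert_action(J, wrench, wrench_ref)
    a_refine = beta * policy(state)              # residual proposed by DDPG actor
    return a_expert + a_refine
```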
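A safety-efficiency reward in the spirit described above might penalize excessive contact forces and elapsed steps while rewarding successful insertion. The thresholds, weights, and termination bonus below are assumed values for illustration only.

```python
# Sketch of a safety-efficiency reward: safety via a contact-force penalty,
# efficiency via a per-step cost, plus a bonus on reaching the target depth.
import numpy as np

def safety_efficiency_reward(wrench, depth, target_depth,
                             f_max=15.0, w_force=0.05, w_step=0.01, bonus=10.0):
    force_mag = np.linalg.norm(wrench[:3])
    if force_mag > f_max:                  # unsafe contact: strong penalty
        return -1.0
    reward = -w_step                       # per-step cost favors fast insertion
    reward -= w_force * force_mag / f_max  # mild penalty for large contact forces
    if depth >= target_depth:              # successful insertion
        reward += bonus
    return reward
```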
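The skill selection step could be realized as a nearest-neighbor lookup over the stored force Jacobian matrices; the Frobenius-norm distance used here is an assumed similarity measure, not necessarily the one adopted in the paper.

```python
# Hypothetical skill-pool lookup: pick the stored skill whose calibrated force
# Jacobian is closest to that of the new component.
import numpy as np

def select_skill(J_new, skill_pool):
    """skill_pool: list of dicts like {"J": ndarray, "model": trained policy}."""
    distances = [np.linalg.norm(J_new - s["J"], ord="fro") for s in skill_pool]
    return skill_pool[int(np.argmin(distances))]["model"]
```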