Keywords: Interactive learning, imitation learning, human-robot collaboration
TL;DR: A human intention detection method that allows the user to correct learned skills by physically interacting with the robot at any point on its body.
Abstract: Human-robot interaction (HRI) is a key requirement for robotic systems to cooperate with humans in everyday scenarios. Among the different ways of interacting with a robot, physical contact offers the human the most direct sense of collaboration. However, to make the best use of this input, the robot must distinguish which of the contacts it detects through force-torque sensing express human intention and which are task-related environmental contacts. In this paper, we propose an energy-tank-based method that detects human intention in three different Degrees of Freedom (DoF) of Cartesian space, allowing the human operator to correct a skill or provide input to it in the desired direction. During the interaction, a key role is played by the controller, which is responsible for the robot's compliant behavior. In our novel approach, we modulate the stiffness and reference force of an impedance controller according to an intention index, making the collaboration smoother and more sensitive. We demonstrate our approach on a pick-and-place manufacturing task, learned from demonstration and executed on a torque-controlled, 7-DoF robot. Thanks to the robot's force and joint-torque sensing capabilities, we exploit an external force observer that allows the user to interact with any part of the robot's body rather than only the end-effector, as is usually the case. Overall, the user is able to interact naturally and intuitively with the system, adding via-points in different DoF to the learned skill through a Kernelized Movement Primitives (KMP) model.
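To make the controller modulation concrete, below is a minimal Python sketch of how an impedance controller's stiffness and reference force could be blended per Cartesian axis according to an intention index. All names and numeric gains (K_TASK, K_COMPLIANT, the linear blend, the damping heuristic) are illustrative assumptions, not the paper's exact formulation, and the energy-tank machinery that detects the intention index is omitted.

```python
import numpy as np

# Hypothetical per-axis stiffness bounds [N/m]: stiff during autonomous task
# execution, soft when human intention is inferred on that axis.
K_TASK = np.diag([800.0, 800.0, 800.0])
K_COMPLIANT = np.diag([100.0, 100.0, 100.0])


def modulated_stiffness(alpha: np.ndarray) -> np.ndarray:
    """Interpolate stiffness per Cartesian axis from an intention index.

    alpha: shape (3,), each entry in [0, 1]; 1 means strong human intention
    detected on that translational DoF.
    """
    alpha = np.clip(alpha, 0.0, 1.0)
    # Linear blend per axis: higher intention -> more compliant behavior.
    k_diag = (1.0 - alpha) * np.diag(K_TASK) + alpha * np.diag(K_COMPLIANT)
    return np.diag(k_diag)


def impedance_wrench(x, x_ref, xdot, alpha, f_ref, damping_ratio=0.7):
    """Cartesian impedance law: F = K(x_ref - x) - D xdot + (1 - alpha) f_ref.

    The reference force f_ref is faded out as intention rises, so the
    controller yields to the human correction instead of opposing it.
    """
    alpha = np.clip(alpha, 0.0, 1.0)
    K = modulated_stiffness(alpha)
    # Damping derived from the current stiffness to stay well-damped
    # as the gains vary online.
    D = 2.0 * damping_ratio * np.sqrt(K)
    return K @ (x_ref - x) - D @ xdot + (1.0 - alpha) * f_ref
```

Deriving the damping from the instantaneous stiffness keeps the response well-damped while the gains change; in the paper's setting, the energy-tank formulation would additionally be needed to guarantee passivity under such online modulation.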
Submission Number: 7