Abstract: With recent advances in computer vision (CV) and artificial intelligence (AI), pointing gestures are emerging as a natural modality for human-robot interaction. Their intuitive, deictic nature makes them well suited for issuing commands, especially those that convey spatial information to robots. In this paper, we propose an augmented pointing gesture estimation method that enables richer, programmable instructions to be given to robots. We define five pointing gestures and demonstrate the approach on a collaborative robot equipped with a multi-finger gripper. Experiments are designed and conducted to evaluate both the spatial pointing accuracy and the gesture estimation accuracy. The results show that the proposed method achieves a mean drift of 8.3 cm and a gesture estimation accuracy of 94.08%.