A Tactile-Based Framework for Active Object Learning and Discrimination using Multimodal Robotic Skin

Abstract: In this letter, we propose a complete probabilistic tactile-based framework that enables robots to autonomously explore unknown workspaces and recognize objects based on their physical properties. Our framework consists of three components: 1) an active pretouch strategy to efficiently explore unknown workspaces; 2) an active touch learning method to learn about unknown objects based on their physical properties (surface texture, stiffness, and thermal conductivity) with the fewest possible training samples; and 3) an active touch algorithm for object discrimination, which selects the most informative exploratory action to apply to the object so that the robot can efficiently distinguish between objects with a small number of actions. Our proposed framework was experimentally evaluated using a robotic arm equipped with multimodal artificial skin. With the active pretouch method, the robot reduced the uncertainty of the workspace by up to 30% and 70% compared to uniform and random strategies, respectively. By means of the active touch learning algorithm, the robot used 50% fewer samples to achieve the same learning accuracy as the baseline methods. By taking advantage of the prior knowledge obtained during the learning process, the robot actively discriminated objects with an improvement of 10% in recognition accuracy compared to the random action selection approach.
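To illustrate the kind of action selection the abstract refers to (choosing the most informative exploratory action for object discrimination), the following is a minimal sketch, not the authors' exact algorithm. It assumes a Bayesian belief over candidate objects and a hypothetical Gaussian measurement model `likelihoods[action]`, where each entry gives an assumed (mean, std) of the sensor reading produced by that action on each object, e.g. learned during the active touch training phase.

```python
import numpy as np

def entropy(p):
    """Shannon entropy of a discrete belief vector."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def expected_information_gain(belief, likelihoods, action, grid):
    """Approximate expected entropy reduction of the object belief after
    executing `action`, averaging over a discretised measurement grid.
    `likelihoods[action]` is a list of (mean, std) pairs, one per object."""
    prior_h = entropy(belief)
    gain, weight_sum = 0.0, 0.0
    for z in grid:
        # p(z | object, action) under the assumed Gaussian model
        like = np.array([
            np.exp(-0.5 * ((z - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
            for mu, sd in likelihoods[action]
        ])
        evidence = float(np.dot(like, belief))   # p(z | action)
        if evidence <= 0:
            continue
        posterior = like * belief / evidence
        gain += evidence * (prior_h - entropy(posterior))
        weight_sum += evidence
    return gain / weight_sum if weight_sum > 0 else 0.0

def select_action(belief, likelihoods, grid):
    """Pick the exploratory action with the highest expected information gain."""
    return max(likelihoods, key=lambda a: expected_information_gain(belief, likelihoods, a, grid))

# Hypothetical usage: two actions (e.g. press for stiffness, slide for texture),
# three candidate objects, uniform prior belief.
likelihoods = {
    "press": [(0.2, 0.05), (0.5, 0.05), (0.8, 0.05)],
    "slide": [(0.4, 0.10), (0.4, 0.10), (0.7, 0.10)],
}
belief = np.ones(3) / 3
grid = np.linspace(0.0, 1.0, 101)
print(select_action(belief, likelihoods, grid))  # expected: "press"
```

In this toy setup, "press" separates the three objects more cleanly than "slide", so an information-gain criterion prefers it; the same principle is what allows fewer exploratory actions than random selection.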