Few-Shot Semi-supervised Learning From Demonstration for Generalisation of Force-Based Motor Skills Across Object Properties

Published: 15 May 2023 (Last Modified: 15 May 2023), Embracing Contacts 2023 Poster
Keywords: manipulation, Learning from Demonstration, haptics, tactile, sensorimotor control, deep learning, unsupervised learning, self-supervised learning, VAE, sim2real
TL;DR: We propose pre-training a haptic representation model to enable few-shot learning of generalisable motor skills through Learning from Demonstration, and validate it with a wiping task on a real robot.
Abstract: In many manipulation tasks, force feedback plays an essential role in adapting the motion to the physical properties of the manipulated object. Existing Learning from Demonstration approaches require a large number of demonstrations to generalise the learned motor skills to objects with unknown properties; however, collecting demonstrations is expensive in time and human effort. We therefore aim to learn to adapt motion to object properties from a small number of demonstrations by exploiting a large amount of unsupervised data, which is less informative about the task but far cheaper to collect. We propose to decouple the haptic representation model from the motion generation model, which enables pre-training of the haptic representation model through self-supervised learning on unsupervised haptic data. We validated our approach on a wiping task using tools with different stiffness and surface friction. Our results suggest that pre-training the haptic model yields force profiles closer to the demonstrated ones during adaptive wiping with sponges of unseen stiffness and friction. We also evaluated sim2real transfer of the haptic representation model, pre-trained on simulation data, to learning downstream tasks on a real robot.
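Since the keywords list a VAE among the self-supervised methods, the sketch below illustrates what such pre-training of a haptic representation model on unlabelled force data might look like. It is a minimal PyTorch example assuming fixed-length windows of 6-axis force/torque readings; all names, dimensions, and hyperparameters are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch: self-supervised pre-training of a haptic representation
# as a VAE on unlabelled haptic windows (assumed 6 axes x 50 timesteps).
import torch
import torch.nn as nn
import torch.nn.functional as F

class HapticVAE(nn.Module):
    def __init__(self, input_dim=6 * 50, latent_dim=8):
        super().__init__()
        # Encoder maps a flattened haptic window to latent mean / log-variance.
        self.encoder = nn.Sequential(nn.Linear(input_dim, 128), nn.ReLU())
        self.fc_mu = nn.Linear(128, latent_dim)
        self.fc_logvar = nn.Linear(128, latent_dim)
        # Decoder reconstructs the haptic window from a latent sample.
        self.decoder = nn.Sequential(
            nn.Linear(latent_dim, 128), nn.ReLU(), nn.Linear(128, input_dim)
        )

    def forward(self, x):
        h = self.encoder(x)
        mu, logvar = self.fc_mu(h), self.fc_logvar(h)
        # Reparameterisation trick: z = mu + sigma * eps.
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)
        return self.decoder(z), mu, logvar

def vae_loss(recon, x, mu, logvar):
    # Reconstruction error plus KL divergence to a unit Gaussian prior.
    recon_loss = F.mse_loss(recon, x, reduction="sum")
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())
    return recon_loss + kl

# Usage: pre-train on unlabelled haptic windows, then freeze/reuse the
# encoder as the haptic representation for the downstream motion model.
model = HapticVAE()
x = torch.randn(32, 6 * 50)  # batch of 32 flattened haptic windows
recon, mu, logvar = model(x)
loss = vae_loss(recon, x, mu, logvar)
loss.backward()
```

The decoupling described in the abstract would then keep this encoder fixed (or fine-tune it lightly) while the motion generation model is trained from only a few demonstrations.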