Deep n-shot transfer learning for tactile material classification with a flexible pressure-sensitive skin

Published: 12 Aug 2019, Last Modified: 24 Sept 2024 · OpenReview Archive Direct Upload · CC BY-NC-ND 4.0
Abstract: n-shot learning, i.e., learning a classifier from only a few or even a single training sample per class, is the ultimate goal in minimizing the cost of sample acquisition. This is especially important for active sensing tasks such as tactile material classification. Achieving high classification accuracy from only a few samples is typically possible only when prior knowledge is used. In n-shot transfer learning, knowledge gained by pre-training on a large knowledge set with many classes and many samples per class has to be transferred to support training on a task set with only a few samples per new class. In this paper, we show for the first time that deep end-to-end transfer learning is feasible for tactile material classification. Building on the previously presented TactNet-II [1], a deep convolutional neural network (CNN) that reaches superhuman tactile classification performance, we adapt state-of-the-art deep transfer learning methods. We evaluate the resulting deep n-shot learning methods on a publicly available tactile material data set with 36 materials [1] in a 6-way n-shot learning task with 30 materials in the knowledge set. In 1-shot learning, our deep transfer learning method reaches 75.5% classification accuracy, and in 10-shot learning more than 90%, outperforming classification without knowledge transfer by more than 40%. This corresponds to an up to 15-fold reduction in the number of samples needed to reach a desired accuracy level. We also provide insights into the inner workings of the derived deep transfer learning methods.
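To make the knowledge-set/task-set setup concrete, the sketch below shows one generic pre-train-then-fine-tune pattern for n-shot transfer. It is not the paper's method: the actual work adapts state-of-the-art deep transfer learning methods to TactNet-II [1], whose architecture is not reproduced here. The `SmallTactileCNN` stand-in, layer sizes, input shape, and data loaders are all illustrative assumptions.

```python
# Hedged sketch of deep n-shot transfer learning for tactile classification.
# The CNN is a placeholder stand-in for TactNet-II (see [1] for the real
# architecture); shapes, hyperparameters, and loaders are assumptions.
import torch
import torch.nn as nn


class SmallTactileCNN(nn.Module):
    """Placeholder 1-D CNN over pressure-sensor time series (not TactNet-II)."""

    def __init__(self, num_classes: int, in_channels: int = 16):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x):
        h = self.features(x).flatten(1)
        return self.classifier(h)


def pretrain_on_knowledge_set(model, loader, epochs=10, lr=1e-3):
    """Standard supervised pre-training on the large knowledge set
    (e.g., 30 material classes with many samples each)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model


def transfer_to_task_set(pretrained, n_way, support_loader, epochs=50, lr=1e-3):
    """One possible transfer strategy: reuse the pre-trained feature extractor,
    freeze it, and fine-tune only a new n_way head on the n support samples
    per new class (e.g., 6-way, 1- to 10-shot)."""
    model = SmallTactileCNN(num_classes=n_way)
    model.features.load_state_dict(pretrained.features.state_dict())
    for p in model.features.parameters():
        p.requires_grad = False
    opt = torch.optim.Adam(model.classifier.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    model.train()
    for _ in range(epochs):
        for x, y in support_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model
```

Freezing the feature extractor is only one point on the transfer spectrum; the baseline without knowledge transfer would instead train the whole network from scratch on the few support samples, which is the comparison the reported >40% accuracy gap refers to.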