Few-shot learning with simplex
Nov 07, 2017 (modified: Nov 07, 2017) · ICLR 2018 Conference Blind Submission
Abstract: Deep learning has achieved remarkable results in many fields. However, learning the parameters of a neural network usually requires a large amount of labeled data. Deep learning algorithms therefore encounter difficulty in supervised learning when only a little labeled data is available, a problem known as one-shot learning. To address it, we propose a novel algorithm for few-shot learning based on discrete geometry: the samples in a class are modeled as the vertices of a reduced simplex, and the volume of this simplex measures the class scatter. During testing, the test sample and the points of the class together form a new simplex. The similarity between the test sample and the class can then be quantified by the ratio of the volume of the new simplex to that of the original class simplex. Moreover, we present an approach to constructing simplices from local regions of the feature maps yielded by convolutional neural networks. Experiments on Omniglot and miniImageNet verify the superiority of our simplex algorithm for few-shot learning.
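The abstract's volume-ratio test can be illustrated with a small sketch. The snippet below computes the volume of a simplex from its vertices via the standard Cayley–Menger determinant and scores a test sample by the ratio of the extended simplex's volume to the class simplex's volume; the function names and the exact score normalization are illustrative assumptions, not the paper's implementation.

```python
import math
import numpy as np

def simplex_volume(points):
    """Volume of the simplex spanned by the rows of `points` ((k+1, d) array),
    computed with the Cayley-Menger determinant."""
    k = len(points) - 1
    # Squared pairwise distances between vertices.
    d2 = np.sum((points[:, None, :] - points[None, :, :]) ** 2, axis=-1)
    # Cayley-Menger matrix: border of ones, zero corner, distances inside.
    cm = np.ones((k + 2, k + 2))
    cm[0, 0] = 0.0
    cm[1:, 1:] = d2
    coeff = (-1) ** (k + 1) / (2 ** k * math.factorial(k) ** 2)
    v2 = coeff * np.linalg.det(cm)
    return math.sqrt(max(v2, 0.0))  # clip tiny negative values from round-off

def simplex_similarity(test_point, class_points, eps=1e-12):
    """Smaller ratio => test point lies closer to the class simplex
    (hypothetical scoring helper, not the paper's exact formula)."""
    base = simplex_volume(class_points)
    extended = simplex_volume(np.vstack([class_points, test_point[None, :]]))
    return extended / (base + eps)
```

For example, the triangle with vertices (0,0), (1,0), (0,1) has `simplex_volume` 0.5, matching its area; a class's scatter is this volume, and a candidate label is the class minimizing the volume ratio.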
TL;DR: A simplex-based geometric method is proposed to cope with few-shot learning problems.
Keywords: one-shot learning, few-shot learning, deep learning, simplex