Prototypical Networks for Few-shot Learning

Jake Snell, Kevin Swersky, Richard Zemel

Nov 05, 2016 (modified: Dec 06, 2016) ICLR 2017 conference submission readers: everyone
  • Abstract: A recent approach to few-shot classification called matching networks has demonstrated the benefits of coupling metric learning with a training procedure that mimics the test conditions. This approach relies on a complicated fine-tuning procedure and an attention scheme that forms a distribution over all points in the support set, scaling poorly with its size. We propose a more streamlined approach, prototypical networks, that learns a metric space in which few-shot classification can be performed by computing Euclidean distances to prototype representations of each class, rather than individual points. Our method is competitive with state-of-the-art one-shot classification approaches while being much simpler and more scalable with the size of the support set. We empirically demonstrate the performance of our approach on the Omniglot and mini-ImageNet datasets. We further demonstrate that a similar idea can be used for zero-shot learning, where each class is described by a set of attributes, and achieve state-of-the-art results on the Caltech-UCSD Birds dataset.
  • TL;DR: We learn a metric space in which few-shot classification can be performed by computing Euclidean distances to a single prototype representative of each class.
  • Conflicts: toronto.edu, twitter.com
  • Keywords: Deep learning, Transfer Learning
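
As a rough illustration of the prototype idea described in the abstract above (not the authors' released code), the sketch below assumes an embedding network has already mapped support and query examples to vectors. Class prototypes are the per-class means of the embedded support points, and each query is assigned to the nearest prototype in Euclidean distance; the helper names and toy episode sizes are hypothetical.

```python
import numpy as np

def prototypes(support_emb, support_labels, num_classes):
    """Per-class mean of the embedded support points (hypothetical helper)."""
    return np.stack([
        support_emb[support_labels == c].mean(axis=0)
        for c in range(num_classes)
    ])

def classify(query_emb, protos):
    """Assign each query to the class of its nearest prototype (squared Euclidean)."""
    # Pairwise squared distances, shape (n_query, n_classes)
    dists = ((query_emb[:, None, :] - protos[None, :, :]) ** 2).sum(axis=-1)
    return dists.argmin(axis=1)

# Toy 3-way, 2-shot episode with 4-dimensional embeddings
rng = np.random.default_rng(0)
support_emb = rng.normal(size=(6, 4))           # 3 classes x 2 shots
support_labels = np.array([0, 0, 1, 1, 2, 2])
query_emb = rng.normal(size=(5, 4))

protos = prototypes(support_emb, support_labels, num_classes=3)
print(classify(query_emb, protos))              # predicted class per query
```

Because each class is summarized by a single prototype, the per-query cost grows with the number of classes rather than with the total number of support points, which is the scalability argument made in the abstract.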
