Fast Task Adaptation for Few-Shot Learning

Anonymous

Sep 25, 2019 · Blind Submission
  • Keywords: Few-Shot Learning, Metric-Softmax Loss, Fast Task Adaptation
  • TL;DR: We propose a novel Metric-Softmax loss to learn task-agnostic features, and adapt the classifier to each few-shot task using a task-adaptive transformation.
  • Abstract: Few-shot classification is challenging due to the scarcity of training examples for each class. The key lies in generalizing prior knowledge learned from large-scale base classes and rapidly adapting the classifier to novel classes. In this paper, we introduce a two-stage framework. In the first stage, we learn task-agnostic features on the base data with a novel Metric-Softmax loss. The Metric-Softmax loss is trained against the whole label set and learns more discriminative features than episodic training. Moreover, the Metric-Softmax classifier can be applied to base and novel classes in a consistent manner, which is critical for the generalizability of the learned features. In the second stage, we design a task-adaptive transformation that adapts the classifier to each few-shot task within only a few tuning epochs. Compared with existing fine-tuning schemes, the scarce examples of the novel classes are exploited more effectively. Experiments show that our approach outperforms the current state of the art by a large margin on the commonly used mini-ImageNet and CUB-200-2011 benchmarks.
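The abstract gives no equations, but the name "Metric-Softmax loss" suggests a cross-entropy computed over metric-based logits (e.g., similarities between an embedding and per-class weight vectors) against the whole base label set rather than a per-episode subset. The minimal numpy sketch below illustrates that general idea under those assumptions; the function name, the cosine-similarity choice, and the `scale` parameter are illustrative, not taken from the paper.

```python
import numpy as np

def metric_softmax_loss(feature, class_weights, label, scale=10.0):
    """Hypothetical sketch: cross-entropy over cosine-similarity logits
    computed against ALL base classes (not just an episode's subset)."""
    f = feature / np.linalg.norm(feature)                             # L2-normalize embedding
    w = class_weights / np.linalg.norm(class_weights, axis=1, keepdims=True)
    logits = scale * (w @ f)                                          # one metric logit per class
    logits -= logits.max()                                            # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()                     # softmax over whole label set
    return -np.log(probs[label])                                      # negative log-likelihood
```

Because the logits are similarities to normalized class vectors, the same scoring rule can be applied unchanged to novel classes at test time by inserting new class vectors, which is consistent with the abstract's claim that the classifier treats base and novel classes uniformly.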