Semi-Supervised Few-Shot Learning Via Dependency Maximization and Instance Discriminant Analysis

Published: 01 Jan 2023, Last Modified: 19 May 2025. J. Signal Process. Syst., 2023. License: CC BY-SA 4.0
Abstract: We study the few-shot learning (FSL) problem, where a model learns to recognize new objects from extremely few labeled training examples per category. Most previous FSL approaches resort to the meta-learning paradigm, where the model accumulates inductive bias by learning many training tasks so as to solve a new, unseen few-shot task. In contrast, we propose a simple semi-supervised FSL approach that exploits the unlabeled data accompanying a few-shot task to improve few-shot performance. (i) First, we propose a Dependency Maximization method based on the Hilbert-Schmidt norm of the cross-covariance operator, which maximizes the statistical dependency between the embedded features of the unlabeled data and their label predictions, jointly with the supervised loss over the support set. We then use the resulting model to infer pseudo-labels for the unlabeled data. (ii) Second, we propose an Instance Discriminant Analysis to evaluate the credibility of each pseudo-labeled example and add the most reliable ones to an augmented support set, with which we retrain the model as in the first step. We iterate this process until the pseudo-labels of the unlabeled set stabilize. Our experiments demonstrate that the proposed method outperforms previous state-of-the-art methods on four widely used few-shot classification benchmarks, including mini-ImageNet, tiered-ImageNet, CUB, and CIFARFS, as well as on the standard few-shot semantic segmentation benchmark PASCAL-5\(^{i}\).
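The dependency measure referenced in step (i) is the Hilbert-Schmidt Independence Criterion (HSIC), the squared Hilbert-Schmidt norm of the cross-covariance operator. Below is a minimal sketch of the standard biased empirical HSIC estimate between a batch of embedded features and their label predictions; the kernel choices (RBF on features, linear on softmax predictions) and all function names are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gram matrix of an RBF kernel over the rows of X (assumed kernel choice)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    # where H = I - (1/n) 11^T centers the Gram matrices.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Illustrative use: features Z of unlabeled examples and their softmax
# predictions P; maximizing hsic(rbf_kernel(Z), P @ P.T) (a linear kernel
# on P) encourages predictions to depend strongly on the features.
```

In training, the negative of this estimate would be added to the supervised support-set loss, so that gradient descent simultaneously fits the labeled examples and maximizes the feature-prediction dependency on the unlabeled set.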