Prototype Completion With Knowledge Distillation and Gated Recurrent Unit for Few-Shot Classification
Abstract: Few-shot learning (FSL) methods imitate the process by which humans learn from a few samples to recognize a novel category. Some FSL methods emphasize obtaining an accurate prototype to precisely measure similarity with query samples. However, owing to the scarcity of data and the noise introduced by background information and other unrelated classes, the prototypes are often biased, which reduces the model's effectiveness. In this paper, we propose a Knowledge Distillation and GRU-based Prototype Completion Network (KDG-ProComNet) to alleviate this problem. In KDG-ProComNet, we introduce a background-discrimination pre-training module that identifies the foreground object of each sample and uses it to obtain more accurate prototypes, thereby reducing the destructive impact of noisy information on the prototypes. Furthermore, we construct a memory bank to store the prototypes and update them via a GRU, rectifying biased prototypes and enhancing their representativeness in the feature space. Extensive experiments on miniImageNet and tieredImageNet demonstrate the outstanding performance of the proposed method.
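The abstract does not give the GRU update equations, but the described step — treating a stored prototype as the recurrent hidden state and a new foreground embedding as the input — can be sketched with a standard GRU cell. The following NumPy sketch is illustrative only; all weight names and dimensions are hypothetical, not taken from the paper.

```python
import numpy as np

def gru_update(prototype, embedding, Wz, Uz, Wr, Ur, Wh, Uh):
    """One standard GRU step: the memory-bank prototype plays the role of
    the hidden state, a new class embedding plays the role of the input.
    Weight matrices are hypothetical placeholders, not the paper's."""
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    z = sigmoid(Wz @ embedding + Uz @ prototype)          # update gate
    r = sigmoid(Wr @ embedding + Ur @ prototype)          # reset gate
    h_tilde = np.tanh(Wh @ embedding + Uh @ (r * prototype))
    # Convex combination: keep part of the old prototype, blend in the new.
    return (1.0 - z) * prototype + z * h_tilde

rng = np.random.default_rng(0)
d = 8                                       # toy feature dimension
proto = rng.standard_normal(d)              # biased prototype from the bank
emb = rng.standard_normal(d)                # new foreground embedding
weights = [0.1 * rng.standard_normal((d, d)) for _ in range(6)]
new_proto = gru_update(proto, emb, *weights)
print(new_proto.shape)
```

The gated form makes the rectification conservative: where the update gate `z` is near zero the stored prototype is kept, so a single noisy embedding cannot overwrite an established class representation.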
External IDs: dblp:conf/ijcnn/ZhangWYZ24