Few-shot learning based on prototype rectification with a self-attention mechanism

Published: 01 Jan 2024, Last Modified: 08 Feb 2025 · Expert Syst. Appl. 2024 · CC BY-SA 4.0
Abstract:

Highlights
• Integrating task-related feature transformation and class significance learning.
• Taking all samples in the support and query sets as global context in self-attention.
• Fusing global context information to learn task-related features for FSL.
• Exploiting pseudolabeled query samples with high confidence to rectify the prototypes (sketched below).
• Learning more unbiased and discriminative class prototypes for FSL.
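The prototype-rectification highlight can be illustrated with a minimal sketch: class prototypes are first computed from the support set, queries are pseudolabeled by their distance to these prototypes, and only high-confidence queries are folded back in to rectify the prototypes. The function name, Euclidean-distance pseudolabeling, and the confidence threshold below are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of prototype rectification with high-confidence pseudolabels.
# Illustrative only: names and the 0.7 threshold are assumptions, not the paper's method.
import torch
import torch.nn.functional as F


def rectify_prototypes(support_feats, support_labels, query_feats,
                       num_classes, conf_threshold=0.7):
    """Refine class prototypes using query samples whose pseudolabel
    confidence exceeds conf_threshold.

    support_feats:  (N_s, D) embedded support samples
    support_labels: (N_s,)   integer class labels
    query_feats:    (N_q, D) embedded query samples
    """
    # Initial prototypes: per-class mean of support embeddings.
    protos = torch.stack([
        support_feats[support_labels == c].mean(dim=0)
        for c in range(num_classes)
    ])  # (C, D)

    # Pseudolabel queries via softmax over negative Euclidean distances.
    dists = torch.cdist(query_feats, protos)   # (N_q, C)
    probs = F.softmax(-dists, dim=1)
    conf, pseudo = probs.max(dim=1)
    keep = conf > conf_threshold                # retain only high-confidence queries

    # Rectified prototypes: mean over support samples plus selected queries.
    rectified = []
    for c in range(num_classes):
        members = torch.cat([support_feats[support_labels == c],
                             query_feats[keep & (pseudo == c)]], dim=0)
        rectified.append(members.mean(dim=0))
    return torch.stack(rectified)
```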