Highlights
• Integrating task-related feature transformation and class significance learning.
• Taking all samples in the support and query sets as global context in self-attention.
• Fusing global context information to learn task-related features for FSL.
• Exploiting pseudo-labeled query samples with high confidence to rectify the prototypes (see the sketch after this list).
• Learning more unbiased and discriminative class prototypes for FSL.
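As a rough illustration of the prototype-rectification idea mentioned above, the sketch below shows one common way high-confidence pseudo-labeled query samples can be folded into the class prototypes of a few-shot episode. It is a minimal, generic sketch, not the paper's exact formulation: the Euclidean metric, softmax-based confidence, the 0.9 threshold, and the function name rectify_prototypes are all illustrative assumptions.

```python
# Minimal sketch (assumed formulation, not the paper's exact method):
# rectify few-shot class prototypes with high-confidence pseudo-labeled queries.
import torch
import torch.nn.functional as F

def rectify_prototypes(support_feat, support_label, query_feat,
                       n_way, conf_threshold=0.9):
    """support_feat: [n_support, d], support_label: [n_support],
    query_feat: [n_query, d]. Returns rectified prototypes of shape [n_way, d]."""
    # 1. Initial prototypes: per-class mean of the support embeddings.
    protos = torch.stack([support_feat[support_label == c].mean(dim=0)
                          for c in range(n_way)])             # [n_way, d]

    # 2. Pseudo-label each query by softmax over negative Euclidean distances.
    dists = torch.cdist(query_feat, protos)                   # [n_query, n_way]
    probs = F.softmax(-dists, dim=1)
    conf, pseudo = probs.max(dim=1)                           # confidence, label

    # 3. Keep only high-confidence queries and fold them into the class means,
    #    yielding less biased prototypes when the support set is tiny.
    rectified = []
    for c in range(n_way):
        members = [support_feat[support_label == c]]
        picked = query_feat[(pseudo == c) & (conf >= conf_threshold)]
        if picked.numel() > 0:
            members.append(picked)
        rectified.append(torch.cat(members, dim=0).mean(dim=0))
    return torch.stack(rectified)                             # [n_way, d]

# Toy usage: a 5-way 1-shot episode with random 64-d embeddings.
if __name__ == "__main__":
    torch.manual_seed(0)
    s_feat = torch.randn(5, 64)
    s_label = torch.arange(5)
    q_feat = torch.randn(30, 64)
    print(rectify_prototypes(s_feat, s_label, q_feat, n_way=5).shape)
```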