Self-Attention Message Passing for Contrastive Few-Shot Learning

Published: 02 Jan 2023 · Last Modified: 08 May 2026 · OpenReview Archive Direct Upload · CC BY-NC 4.0
Abstract: Humans have a unique ability to learn new representations from just a handful of examples with little to no supervision. Deep learning models, however, require an abundance of data and supervision to perform at a satisfactory level. Unsupervised few-shot learning (U-FSL) is the pursuit of bridging this gap between machines and humans. Inspired by the capacity of graph neural networks (GNNs) to discover complex inter-sample relationships, we propose a novel self-attention based message passing contrastive learning approach (coined SAMP-CLR) for U-FSL pre-training. We also propose an optimal transport (OT) based fine-tuning strategy (which we call OpT-Tune) to efficiently induce task awareness into our novel end-to-end unsupervised few-shot classification framework (SAMPTransfer). Our extensive experimental results corroborate the efficacy of SAMPTransfer in a variety of downstream few-shot classification scenarios, setting a new state of the art for U-FSL on both miniImageNet and tieredImageNet benchmarks, with gains of up to 7% and 5%, respectively. Our further investigations also confirm that SAMPTransfer remains on par with some supervised baselines on miniImageNet and outperforms all existing U-FSL baselines in a challenging cross-domain scenario. Our code can be found in our GitHub repository: https://github.com/ojss/SAMPTransfer/.
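To make the "self-attention based message passing" idea concrete, below is a minimal PyTorch sketch of one attention-driven message-passing step over a batch of sample embeddings, treating the batch as a fully connected graph. It is an illustration of the general technique, not the authors' implementation; the class name, embedding size, and head count are assumptions.

    # Minimal sketch: one self-attention message-passing step over sample
    # embeddings (illustrative; not the SAMPTransfer repository's code).
    import torch
    import torch.nn as nn

    class SelfAttnMessagePassing(nn.Module):
        def __init__(self, dim: int, heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.norm = nn.LayerNorm(dim)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (num_samples, dim). Every sample attends to every other
            # sample, so attention weights act as soft graph edges.
            h = x.unsqueeze(0)                    # (1, N, dim)
            msg, _ = self.attn(h, h, h)           # aggregate neighbor messages
            return self.norm(h + msg).squeeze(0)  # residual node update

    # Usage: refine 64 sample embeddings of size 128 before a contrastive loss.
    z = torch.randn(64, 128)
    refined = SelfAttnMessagePassing(128)(z)
    print(refined.shape)  # torch.Size([64, 128])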
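Similarly, OT-based fine-tuning of the kind the abstract describes is commonly realized with Sinkhorn iterations that compute a transport plan between query and support embeddings. The sketch below shows the generic Sinkhorn algorithm under uniform marginals; the function name, entropic regularization eps, and iteration count are assumptions, not the paper's exact settings.

    # Minimal Sinkhorn sketch for optimal-transport alignment between query
    # and support features (illustrative; not the OpT-Tune implementation).
    import torch

    def sinkhorn(cost: torch.Tensor, eps: float = 0.1, n_iters: int = 50) -> torch.Tensor:
        # cost: (n, m) pairwise cost matrix, e.g. 1 - cosine similarity.
        # Returns an (n, m) transport plan with uniform row/column marginals.
        n, m = cost.shape
        K = torch.exp(-cost / eps)              # Gibbs kernel
        r = torch.full((n,), 1.0 / n)           # uniform row marginal
        c = torch.full((m,), 1.0 / m)           # uniform column marginal
        u, v = r.clone(), c.clone()
        for _ in range(n_iters):                # alternating scaling updates
            u = r / (K @ v)
            v = c / (K.T @ u)
        return u.unsqueeze(1) * K * v.unsqueeze(0)

    # Usage: transport plan between 15 query and 5 support embeddings.
    q, s = torch.randn(15, 64), torch.randn(5, 64)
    cost = 1 - torch.nn.functional.cosine_similarity(
        q.unsqueeze(1), s.unsqueeze(0), dim=-1)
    plan = sinkhorn(cost)
    print(plan.sum())  # ~1.0: a valid joint distribution over pairs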