AWE: Adaptive weight-space ensembling for few-shot fine-tuning

Published: 04 Mar 2023, Last Modified: 16 May 2023, ME-FoMo 2023 Poster
Keywords: Weight-space ensembles, CLIP, Few-shot learning
TL;DR: Weight-space interpolation between zero-shot and fine-tuned CLIP yields significant benefits for few-shot learning. We leverage predictable patterns in the optimal interpolation coefficient to approximate it without a validation set.
Abstract: In this paper, we introduce a new transfer learning approach called Adaptive Weight-space Ensembling (AWE) that effectively adapts large pre-trained models to downstream tasks with limited fine-tuning data. Traditional transfer learning methods often struggle or become infeasible in scenarios with only a few examples per class, particularly when a validation set is needed. AWE overcomes these challenges by adapting the weight-space ensembling technique, originally developed for large-scale data, to few-shot settings without requiring a validation set. By identifying patterns in oracle weight-space ensembling, we derive an adaptive ensembling method that can be easily implemented in real-world applications. Our approach outperforms existing state-of-the-art methods by more than 2% on average on standard few-shot learning benchmarks.
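The core operation the abstract refers to, interpolating between zero-shot and fine-tuned weights, can be sketched as follows. This is a minimal illustration, not the paper's implementation: weights are represented as plain dicts of scalars, and `alpha` stands in for the interpolation coefficient that AWE estimates adaptively (here it is just a fixed illustrative value).

```python
def interpolate_weights(zero_shot, fine_tuned, alpha):
    """Per-parameter interpolation: (1 - alpha) * zero-shot + alpha * fine-tuned.

    `zero_shot` and `fine_tuned` are dicts mapping parameter names to values
    (scalars here for illustration; tensors in practice).
    """
    assert zero_shot.keys() == fine_tuned.keys()
    return {
        name: (1.0 - alpha) * zero_shot[name] + alpha * fine_tuned[name]
        for name in zero_shot
    }

# Toy example with scalar "parameters" (hypothetical names):
w_zs = {"layer.weight": 0.0, "layer.bias": 1.0}   # zero-shot CLIP weights
w_ft = {"layer.weight": 2.0, "layer.bias": 3.0}   # few-shot fine-tuned weights
w_mix = interpolate_weights(w_zs, w_ft, alpha=0.5)
# w_mix == {"layer.weight": 1.0, "layer.bias": 2.0}
```

With `alpha = 0` the ensemble recovers the zero-shot model and with `alpha = 1` the fine-tuned model; AWE's contribution is choosing `alpha` between these extremes without a held-out validation set.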