MGML: Momentum group meta-learning for few-shot image classification

Published: 01 Jan 2022, Last Modified: 15 May 2025 · Neurocomputing 2022 · CC BY-SA 4.0
Abstract: Highlights

- GML (Group Meta-Learning) is proposed to effectively alleviate the adverse impact that low-quality samples have on training under few-shot conditions, thereby improving model performance.
- A momentum update strategy is introduced to few-shot learning for the first time, effectively improving the generalization and stability of the model; an adaptive momentum coefficient is further designed, forming AMS (Adaptive Momentum Smoothing), to improve training efficiency.
- We propose MGML (Momentum Group Meta-Learning) by combining GML and AMS. MGML not only improves the accuracy of the Meta-Learning Baseline but also transfers well: it can be inserted into previous state-of-the-art methods and yields consistent performance improvements.
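To illustrate the momentum-smoothing idea behind AMS, the sketch below maintains an exponential moving average (EMA) of model weights with a momentum coefficient that adapts over training. The specific schedule (`adaptive_momentum`) and function names are illustrative assumptions, not the paper's exact AMS formulation.

```python
# Minimal sketch of momentum (EMA) smoothing of model weights.
# The adaptive-coefficient schedule below is an illustrative assumption,
# not the paper's exact AMS rule.

def ema_update(smoothed, current, momentum):
    """Blend smoothed weights toward current weights: s <- m*s + (1-m)*c."""
    return [s * momentum + c * (1.0 - momentum) for s, c in zip(smoothed, current)]

def adaptive_momentum(step, total_steps, m_min=0.9, m_max=0.999):
    """Hypothetical schedule: momentum grows linearly as training progresses,
    so the smoothed weights change more slowly late in training."""
    frac = min(step / max(total_steps, 1), 1.0)
    return m_min + (m_max - m_min) * frac

# Usage: smooth a stand-in weight vector over 100 training steps.
smoothed = [0.0, 0.0]
for step in range(100):
    current = [1.0, 2.0]  # stand-in for freshly updated model weights
    m = adaptive_momentum(step, 100)
    smoothed = ema_update(smoothed, current, m)
```

The smoothed copy of the weights evolves more conservatively than the raw weights, which is the mechanism the abstract credits for improved stability and generalization.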