Few-shot class incremental learning via prompt transfer and knowledge distillation

Published: 01 Jan 2024 · Last Modified: 13 Nov 2024 · Image Vis. Comput. 2024 · CC BY-SA 4.0
Abstract:

Highlights
• Few-shot incremental learning is the ability of a model to learn incrementally from limited data without forgetting.
• The central challenge for incremental learning models is catastrophic forgetting.
• Prompt-guided knowledge distillation is used to mitigate catastrophic forgetting.
• An attention-based knowledge distillation strategy identifies similar features between the student and teacher models.
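The highlights describe an attention-based distillation loss between student and teacher features but give no implementation details. Below is a minimal sketch of that general idea, following the common attention-transfer formulation: spatial attention maps are derived from intermediate feature maps of both models, and the student is penalized for diverging from the teacher's maps. All names (`attention_map`, `attention_distillation_loss`) and the exact loss form are assumptions for illustration, not the authors' actual method.

```python
import torch
import torch.nn.functional as F

def attention_map(features: torch.Tensor) -> torch.Tensor:
    """Collapse a feature map (B, C, H, W) into an L2-normalized spatial
    attention map (B, H*W) by summing squared activations over channels."""
    attn = features.pow(2).sum(dim=1).flatten(start_dim=1)  # (B, H*W)
    return F.normalize(attn, p=2, dim=1)

def attention_distillation_loss(student_feats, teacher_feats):
    """Mean squared distance between student and teacher attention maps,
    averaged over layers; encourages the student to attend to the same
    spatial regions the (frozen) teacher attends to."""
    losses = [
        (attention_map(s) - attention_map(t)).pow(2).mean()
        for s, t in zip(student_feats, teacher_feats)
    ]
    return torch.stack(losses).mean()

# Toy usage: stand-ins for intermediate feature maps from two layers.
student = [torch.randn(4, 64, 16, 16), torch.randn(4, 128, 8, 8)]
teacher = [torch.randn(4, 64, 16, 16), torch.randn(4, 128, 8, 8)]
loss = attention_distillation_loss(student, teacher)
print(loss.item())
```

In an incremental-learning loop, a loss like this would typically be added (with a weighting coefficient) to the classification loss on the few new-class samples, with the teacher being a frozen copy of the model from the previous session, so old-task behavior is preserved while new classes are learned.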