Abstract: Generalized Few-Shot Learning (GFSL) aims to recognize novel classes from limited training samples without forgetting knowledge of auxiliary data (base classes). Most current approaches re-engage the base classes after initial training to balance the predictive bias between the base and novel classes. However, re-using the auxiliary data might not always be possible due to privacy or ethical constraints. Consequently, the zero-base GFSL paradigm emerges, in which models trained on the base classes are directly fine-tuned on the novel classes without revisiting the auxiliary data, forgoing any re-balancing of prediction biases. We believe that succeeding in this paradigm hinges on a critical yet often overlooked issue: feature overlap between the base and novel classes in the embedding space. To tackle this issue, we propose the Concept Agent Network, a novel framework that interprets visual features as affinity features, effectively diminishing feature overlap by aggregating feature embeddings of the novel classes according to their similarity with the base classes. Additionally, we present the Concept Catena Generator, which creates multiple concepts per base class, improving the modeling of the base-class feature distribution and clarifying the relationships between base and novel concepts. To prevent catastrophic forgetting of the base classes when adapting to the novel ones, we propose an Active Training Regularization strategy that promotes the preservation of base-class knowledge. Extensive experimental results on two benchmarks, mini-ImageNet and tiered-ImageNet, demonstrate the effectiveness of our framework. Its potential utility spans several real-world applications, including autonomous driving, medical image analysis, and real-time surveillance, where the ability to rapidly learn from a few examples without forgetting previously acquired knowledge is critical.