Hierarchical Knowledge Propagation and Distillation for Few-Shot Learning

Published: 01 Jan 2023 · Last Modified: 19 Feb 2025 · Neural Networks 2023 · CC BY-SA 4.0
Abstract — Highlights:
- We highlight the significance of inductive Few-Shot Learning (FSL) in real-world settings.
- Existing inductive FSL methods usually ignore the relations between sample-level and class-level representations.
- The proposed HKPD leverages these relations and is designed for the inductive setting.
- A self-distillation module is designed to further improve performance.
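The self-distillation highlight can be illustrated with a generic sketch. The paper's exact HKPD loss is not given in the abstract, so the form below — ground-truth cross-entropy plus a temperature-scaled KL term pulling the student toward a frozen earlier snapshot of the same network — is an assumption, showing only the standard self-distillation pattern:

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Numerically stable softmax with optional temperature scaling."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def self_distillation_loss(student_logits, teacher_logits, labels,
                           alpha=0.5, temperature=4.0):
    """Generic self-distillation objective (hypothetical form, not the
    paper's exact HKPD loss): a weighted sum of cross-entropy on the
    true labels and KL divergence to a frozen teacher snapshot."""
    # Cross-entropy against ground-truth labels.
    p_student = softmax(student_logits)
    ce = -np.log(p_student[np.arange(len(labels)), labels]).mean()
    # KL(teacher || student) on temperature-softened distributions.
    p_t = softmax(teacher_logits, temperature)
    p_s = softmax(student_logits, temperature)
    kl = (p_t * (np.log(p_t) - np.log(p_s))).sum(axis=-1).mean()
    # The T^2 factor keeps gradient magnitudes comparable across temperatures.
    return (1.0 - alpha) * ce + alpha * (temperature ** 2) * kl
```

When the teacher is an identical snapshot of the student, the KL term vanishes and only the supervised cross-entropy remains, which is why self-distillation acts as a soft regularizer rather than replacing the task loss.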