A New Similarity-Based Relational Knowledge Distillation Method

Published: 01 Jan 2024 · Last Modified: 30 Oct 2024 · ICASSP 2024 · CC BY-SA 4.0
Abstract: Previous relation-based knowledge distillation methods tend to construct a global similarity relationship matrix within a mini-batch while ignoring neighbourhood relationship knowledge. In this paper, we propose a new similarity-based relational knowledge distillation method that transfers neighbourhood relationship knowledge by selecting the K nearest neighbours of each sample. Our method consists of two components: Neighbourhood Feature Relationship Distillation and Neighbourhood Logits Relationship Distillation. We perform extensive experiments on the CIFAR-100 and Tiny ImageNet classification datasets and show that our method outperforms state-of-the-art knowledge distillation methods. Our code is available at: https://github.com/xinxiaoxiaomeng/NRKD.git.
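To make the core idea concrete, here is a minimal sketch of a neighbourhood-based relational distillation loss, not the authors' implementation: for each sample, the K most similar mini-batch neighbours are chosen in the teacher's feature space, and the student is trained to match the teacher's similarities over just those neighbours (rather than the full batch-wide similarity matrix). The function name, the choice of cosine similarity, and the MSE matching term are illustrative assumptions; the paper's actual loss may differ.

```python
import numpy as np

def knn_relation_loss(t_feat: np.ndarray, s_feat: np.ndarray, k: int = 3) -> float:
    """Hypothetical sketch of neighbourhood relation distillation.

    t_feat, s_feat: (B, D) teacher and student feature matrices for one
    mini-batch. For each sample, the K nearest neighbours (by teacher
    cosine similarity) are selected, and the student's similarities to
    those neighbours are matched to the teacher's via mean squared error.
    """
    def cosine_sim(x: np.ndarray) -> np.ndarray:
        x = x / np.linalg.norm(x, axis=1, keepdims=True)
        return x @ x.T

    t_sim = cosine_sim(t_feat)  # (B, B) teacher pairwise similarities
    s_sim = cosine_sim(s_feat)  # (B, B) student pairwise similarities

    batch_size = t_feat.shape[0]
    loss = 0.0
    for i in range(batch_size):
        # Rank samples by teacher similarity, exclude self, keep top K.
        order = np.argsort(-t_sim[i])
        nbrs = [j for j in order if j != i][:k]
        # Match student to teacher only over the local neighbourhood.
        loss += np.mean((t_sim[i, nbrs] - s_sim[i, nbrs]) ** 2)
    return loss / batch_size
```

Restricting the relational term to each sample's K nearest neighbours is what distinguishes this from global relational distillation, which penalises mismatches over all B×B pairs in the batch.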