Student-Oriented Teacher Knowledge Refinement for Knowledge Distillation

Published: 20 Jul 2024 · Last Modified: 21 Jul 2024 · MM2024 Poster · CC BY 4.0
Abstract: Knowledge distillation is widely recognized for its ability to transfer knowledge from a large teacher network to a compact, more streamlined student network. Traditional knowledge distillation methods primarily follow a teacher-oriented paradigm that imposes the task of learning the teacher's complex knowledge onto the student network. However, significant disparities in model capacity and architectural design hinder the student's comprehension of the complex knowledge imparted by the teacher, resulting in sub-optimal learning. This paper introduces a novel approach that adopts a student-oriented perspective and refines the teacher's knowledge to better align with the student's needs, thereby improving the effectiveness of knowledge transfer. Specifically, we present Student-Oriented Knowledge Distillation (SoKD), which incorporates a learnable feature augmentation strategy during training to dynamically refine the teacher's knowledge for the student. Furthermore, we deploy the Distinctive Area Detection Module (DAM) to identify areas of mutual interest between the teacher and the student, concentrating knowledge transfer within these critical areas to avoid spreading irrelevant information. This targeted approach ensures a more focused and effective knowledge distillation process. Functioning as a plug-in, our approach can be integrated with various knowledge distillation methods. Extensive experimental results demonstrate the efficacy and generalizability of our method.
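The abstract names two mechanisms: a learnable augmentation that refines teacher features toward the student, and a detection module (DAM) that restricts distillation to regions both networks attend to. Below is a minimal sketch of how these two ideas could be wired together, assuming standard PyTorch. All names (FeatureAugment, mutual_interest_mask, sokd_feature_loss) and design details (residual 1x1-conv refinement, activation-overlap masking, matching channel counts between teacher and student features) are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the abstract's two ideas; not the paper's code.
import torch
import torch.nn as nn


class FeatureAugment(nn.Module):
    """Learnable augmentation that refines teacher features; trained
    jointly with the student so the refinement is student-oriented."""

    def __init__(self, channels: int):
        super().__init__()
        self.refine = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.BatchNorm2d(channels),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels, channels, kernel_size=1),
        )

    def forward(self, teacher_feat: torch.Tensor) -> torch.Tensor:
        # Residual form keeps the original teacher signal accessible.
        return teacher_feat + self.refine(teacher_feat)


def mutual_interest_mask(t_feat: torch.Tensor,
                         s_feat: torch.Tensor,
                         keep_ratio: float = 0.5) -> torch.Tensor:
    """Keep spatial locations where both teacher and student activations
    are strong -- a crude stand-in for the paper's DAM."""
    t_map = t_feat.abs().mean(dim=1, keepdim=True)  # (B, 1, H, W)
    s_map = s_feat.abs().mean(dim=1, keepdim=True)
    joint = t_map * s_map
    k = max(1, int(keep_ratio * joint[0].numel()))
    # Per-sample threshold: the k-th largest joint-activation value.
    thresh = joint.flatten(1).topk(k, dim=1).values[:, -1]
    return (joint >= thresh.view(-1, 1, 1, 1)).float()


def sokd_feature_loss(t_feat: torch.Tensor,
                      s_feat: torch.Tensor,
                      augment: FeatureAugment) -> torch.Tensor:
    """Feature-distillation loss restricted to mutual-interest regions.
    Assumes teacher and student features share shape (B, C, H, W);
    in practice a projection layer may be needed to align them."""
    refined = augment(t_feat)
    mask = mutual_interest_mask(refined, s_feat)
    diff = (refined - s_feat) ** 2 * mask  # mask broadcasts over channels
    denom = (mask.sum() * refined.size(1)).clamp(min=1.0)
    return diff.sum() / denom


if __name__ == "__main__":
    # Toy usage: random stand-ins for teacher/student feature maps.
    t = torch.randn(2, 64, 8, 8)
    s = torch.randn(2, 64, 8, 8)
    aug = FeatureAugment(64)
    print(sokd_feature_loss(t, s, aug))  # scalar loss tensor
```

In this reading, the augmentation module is optimized alongside the student, so gradients from the distillation loss shape what the "refined teacher" presents, while the mask keeps the penalty away from regions only one network cares about.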
Primary Subject Area: [Content] Vision and Language
Secondary Subject Area: [Content] Media Interpretation
Relevance To Conference: Knowledge distillation is one of the key techniques for model light-weighting: it yields compact yet effective networks that can process multimedia information faster and at lower cost.
Submission Number: 1176