Beyond Direct Relationships: Exploring Multi-Order Label Pair Dependencies for Knowledge Distillation

Published: 27 Oct 2024, Last Modified: 06 Mar 2025. OpenReview Archive Direct Upload. License: CC BY 4.0.
Abstract: Multi-label image classification is crucial for a wide range of multimedia applications. To address resource limitations, various knowledge distillation (KD) methods have been developed to transfer knowledge from a large network (the "teacher") to a small network (the "student"). However, existing KD methods do not explicitly distill the dependencies between labels, which limits the model's ability to capture multi-label correlations. Furthermore, although existing methods for multi-label image classification have exploited second-order label pair dependencies (the direct dependency between two labels), high-order label pair dependencies, which capture the indirect dependency between two labels, remain unexplored. In this paper, we propose a \textbf{\underline{M}}ulti-Order Label Pair \textbf{\underline{D}}ependencies \textbf{\underline{K}}nowledge \textbf{\underline{D}}istillation (MDKD) framework. MDKD explicitly distills knowledge that captures multi-order dependencies between labels, including both second-order and high-order label pair dependencies, thus transferring insights into label correlations from different perspectives. Extensive experiments on Pascal VOC2007, MSCOCO2014, and NUS-WIDE demonstrate the superior performance of MDKD.
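The abstract does not specify how the paper computes its dependency terms, but the idea of distilling second-order (direct) and high-order (indirect) label pair dependencies can be sketched as follows. All names here (`pairwise_dependency`, `mdkd_style_loss`) and the concrete choices — outer products of predicted label probabilities for second-order dependencies, matrix powers for indirect dependencies mediated by intermediate labels, and an MSE matching loss — are illustrative assumptions, not the authors' actual formulation.

```python
import torch
import torch.nn.functional as F

def pairwise_dependency(logits: torch.Tensor) -> torch.Tensor:
    """Second-order (direct) label pair dependency matrix.

    Per-sample label probabilities p are combined via the outer
    product p p^T and averaged over the batch, giving a (C, C)
    co-activation matrix. (Illustrative choice, not from the paper.)
    """
    p = torch.sigmoid(logits)                              # (B, C)
    return torch.einsum('bi,bj->ij', p, p) / p.size(0)     # (C, C)

def mdkd_style_loss(teacher_logits: torch.Tensor,
                    student_logits: torch.Tensor,
                    max_order: int = 2) -> torch.Tensor:
    """Hypothetical multi-order dependency distillation loss.

    Matches the student's dependency matrix to the teacher's at each
    order; successive matrix powers approximate indirect (high-order)
    dependencies routed through intermediate labels.
    """
    t_dep = pairwise_dependency(teacher_logits)
    s_dep = pairwise_dependency(student_logits)
    loss = torch.zeros((), dtype=t_dep.dtype)
    t_k, s_k = t_dep, s_dep
    for _ in range(max_order):
        loss = loss + F.mse_loss(s_k, t_k)
        # Move to the next order of (indirect) dependency.
        t_k, s_k = t_k @ t_dep, s_k @ s_dep
    return loss
```

In a full training loop, a loss of this form would be added to the usual classification loss (and any feature-level KD terms), so the student is penalized both for wrong per-label predictions and for failing to reproduce the teacher's label correlation structure.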