Distilling Image Classifiers in Object Detectors

Published: 09 Nov 2021, Last Modified: 14 Jul 2024
NeurIPS 2021 Poster
Keywords: knowledge distillation, classifier-to-detector, across tasks, object detection
Abstract: Knowledge distillation constitutes a simple yet effective way to improve the performance of a compact student network by exploiting the knowledge of a more powerful teacher. Nevertheless, the knowledge distillation literature remains limited to the scenario where the student and the teacher tackle the same task. Here, we investigate the problem of transferring knowledge not only across architectures but also across tasks. To this end, we study the case of object detection and, instead of following the standard detector-to-detector distillation approach, introduce a classifier-to-detector knowledge transfer framework. In particular, we propose strategies to exploit the classification teacher to improve both the detector's recognition accuracy and localization performance. Our experiments on several detectors with different backbones demonstrate the effectiveness of our approach, allowing us to outperform the state-of-the-art detector-to-detector distillation methods.
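The paper's specific transfer strategies are described in the full text and code; purely as an illustration of what distilling a classification teacher into a detector's recognition branch can look like, here is a minimal PyTorch sketch. It assumes a shared label space between the teacher and the detector, uses ground-truth boxes as the distilled regions, and applies a standard Hinton-style soft-label loss; the function `classifier_to_detector_kd_loss` and its arguments are hypothetical, not the authors' API.

```python
# Illustrative sketch of classifier-to-detector distillation on the
# recognition branch (hypothetical; not the authors' exact method).
# A frozen image classifier supervises the detector's per-region
# classification head via soft predictions on box crops.

import torch
import torch.nn.functional as F
import torchvision


def classifier_to_detector_kd_loss(images, gt_boxes, student_region_logits,
                                   teacher, temperature=4.0):
    """images: (B, 3, H, W) batch; gt_boxes: list of (N_i, 4) tensors of
    (x1, y1, x2, y2) boxes per image; student_region_logits: (sum N_i, C)
    classification logits the detector produced for those regions;
    teacher: frozen classifier mapping (N, 3, 224, 224) -> (N, C) logits."""
    # Crop and resize each annotated box so the classification teacher
    # receives image-classification-style inputs.
    crops = torchvision.ops.roi_align(
        images, gt_boxes, output_size=(224, 224), spatial_scale=1.0)
    with torch.no_grad():
        teacher_logits = teacher(crops)
    # Soften both distributions and match them with KL divergence,
    # scaled by T^2 as in standard knowledge distillation.
    t = temperature
    return F.kl_div(
        F.log_softmax(student_region_logits / t, dim=1),
        F.softmax(teacher_logits / t, dim=1),
        reduction="batchmean") * (t * t)
```

In practice this term would be added, with a weighting coefficient, to the detector's usual classification and box-regression losses; the paper additionally proposes using the classification teacher to improve localization, which this sketch does not cover.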
Code Of Conduct: I certify that all co-authors of this work have read and commit to adhering to the NeurIPS Statement on Ethics, Fairness, Inclusivity, and Code of Conduct.
TL;DR: We introduce a classifier-to-detector distillation framework to transfer knowledge across both architectures and tasks, improving the performance of compact student detectors.
Supplementary Material: pdf
Code: https://github.com/NVlabs/DICOD
Community Implementations: 3 code implementations (CatalyzeX): https://www.catalyzex.com/paper/distilling-image-classifiers-in-object/code