Keywords: Object Detection, Knowledge Distillation, Uncertainty Estimation
Abstract: Knowledge distillation (KD) has become a fundamental technique for model compression in object detection tasks. Data noise and training randomness can make the knowledge of the teacher model unreliable, a phenomenon referred to as knowledge uncertainty. Existing methods transfer this knowledge as-is, ignoring its uncertainty, which can limit the student's ability to capture and understand the underlying ``dark knowledge''. In this work, we introduce a new strategy that explicitly incorporates knowledge uncertainty, named Uncertainty-Driven Knowledge Extraction and Transfer (UET). Since the knowledge distribution is unknown and high-dimensional in practice, we introduce a simple yet effective sampling method based on Monte Carlo dropout (MC dropout) to estimate the teacher's knowledge uncertainty. Leveraging information theory, we integrate knowledge uncertainty into the conventional KD process, allowing the student model to benefit from knowledge diversity. UET is a plug-and-play method that integrates seamlessly with existing distillation techniques. We validate our approach through comprehensive experiments across various distillation strategies, detectors, and backbones. Specifically, UET achieves state-of-the-art results, with a ResNet50-based GFL detector obtaining 44.1\% mAP on the COCO dataset, surpassing the baseline by 3.9\%.
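The MC-dropout sampling step described in the abstract can be illustrated with a minimal sketch: keep dropout active at inference time, run several stochastic forward passes of the teacher, and take the per-output mean and variance as an uncertainty estimate. The toy two-layer network and shapes below are hypothetical stand-ins (the paper applies this to detector heads), shown here only to make the estimation procedure concrete.

```python
import numpy as np

def mc_dropout_logits(x, W1, W2, p=0.5, T=20, rng=None):
    """Estimate output uncertainty via Monte Carlo dropout.

    Runs T stochastic forward passes of a toy two-layer network with
    dropout kept active, and returns the per-class mean and variance
    of the sampled logits as an uncertainty estimate.
    """
    rng = rng or np.random.default_rng(0)
    samples = []
    for _ in range(T):
        h = np.maximum(x @ W1, 0.0)        # ReLU hidden layer
        mask = rng.random(h.shape) > p     # dropout stays ON at inference
        h = h * mask / (1.0 - p)           # inverted-dropout scaling
        samples.append(h @ W2)
    samples = np.stack(samples)            # shape: (T, num_classes)
    return samples.mean(axis=0), samples.var(axis=0)

# Toy usage with random weights (illustrative only).
rng = np.random.default_rng(42)
x = rng.normal(size=(8,))
W1 = rng.normal(size=(8, 16))
W2 = rng.normal(size=(16, 5))
mean_logits, var_logits = mc_dropout_logits(x, W1, W2, rng=rng)
```

Outputs with high variance across the T passes indicate regions where the teacher's knowledge is uncertain, which is the signal UET feeds into the distillation loss.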
Primary Area: unsupervised, self-supervised, semi-supervised, and supervised representation learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 3342