Multi-expert collaboration: Enhancing heterogeneous knowledge independence and alignment in knowledge distillation
Keywords: knowledge distillation, heterogeneous knowledge, multi-teacher knowledge distillation, independence and alignment
Abstract: Heterogeneous multi-teacher knowledge distillation attempts to learn a versatile student neural network from multiple pre-trained heterogeneous teachers. However, existing methods suffer from a lack of independence and alignment among the heterogeneous knowledge. To address this issue, we propose a novel method called Multi-Expert Collaboration (MEC). Our approach aggregates multiple expert classifiers within the student model, replacing the conventional single-head architecture. Because each expert classifier operates independently, without interfering with the others, the independence of heterogeneous knowledge is preserved. Inspired by Helmholtz Free Energy (HFE) theory, we introduce an anchor-based HFE self-normalization strategy to align the heterogeneous knowledge effectively. This strategy keeps energy levels consistent across all classifiers, so that the appropriate classifier attains the highest confidence on in-distribution data. Extensive experiments on the CIFAR-100 and ImageNet-100 datasets demonstrate that MEC significantly outperforms existing heterogeneous multi-teacher knowledge distillation methods, achieving an average accuracy improvement of over 10%.
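To make the multi-expert architecture and the energy-based alignment concrete, below is a minimal PyTorch sketch, not the authors' implementation. It assumes a shared feature backbone, one linear expert head per teacher, and an anchor energy maintained as a running average of each expert's in-distribution free energy; the exact anchor-based self-normalization in MEC may differ.

```python
# Illustrative sketch of a multi-expert student with HFE-style normalization.
# Assumptions (not taken from the paper): shared backbone, linear expert heads,
# running-average anchor energies used to put all experts on a comparable scale.
import torch
import torch.nn as nn


def helmholtz_free_energy(logits: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    """Per-sample free energy E(x) = -T * logsumexp(logits / T)."""
    return -temperature * torch.logsumexp(logits / temperature, dim=-1)


class MultiExpertStudent(nn.Module):
    def __init__(self, backbone: nn.Module, feat_dim: int, num_classes_per_teacher: list):
        super().__init__()
        self.backbone = backbone
        # One independent expert classifier per heterogeneous teacher.
        self.experts = nn.ModuleList(
            nn.Linear(feat_dim, c) for c in num_classes_per_teacher
        )
        # Anchor energies used to self-normalize each expert's free energy.
        self.register_buffer("anchors", torch.zeros(len(num_classes_per_teacher)))

    def forward(self, x: torch.Tensor, temperature: float = 1.0):
        feats = self.backbone(x)
        logits = [head(feats) for head in self.experts]
        # Shift each expert's energy by its anchor so energy levels are comparable;
        # the expert with the lowest normalized energy is treated as the one
        # responsible for (most confident on) the input.
        energies = torch.stack(
            [helmholtz_free_energy(l, temperature) - a
             for l, a in zip(logits, self.anchors)],
            dim=-1,
        )
        return logits, energies

    @torch.no_grad()
    def update_anchor(self, expert_idx: int, logits: torch.Tensor, momentum: float = 0.9):
        """Running-average anchor update from one expert's in-distribution logits."""
        e = helmholtz_free_energy(logits).mean()
        self.anchors[expert_idx] = momentum * self.anchors[expert_idx] + (1 - momentum) * e
```

In this reading, each head is supervised only by its own teacher (keeping the heterogeneous knowledge independent), while the shared anchors align the confidence scales so the per-expert energies can be compared directly at inference time.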
Primary Area: transfer learning, meta learning, and lifelong learning
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 9979