EKDSC: Long-tailed recognition based on expert knowledge distillation for specific categories

Published: 2026, Last Modified: 12 Nov 2025. Neural Networks, 2026. License: CC BY-SA 4.0.
Abstract: Highlights
• The EKDSC method is proposed, which leverages category-specific expert teacher models to distill knowledge tailored to head, mid, and tail classes into a student, significantly improving recognition of tail categories while maintaining high performance on head classes.
• A multi-expert teacher model is designed for specific categories via primary optimization and interference suppression, reducing interference from other classes and enhancing the reliability and precision of the experts.
• State-of-the-art performance: a 1–5% accuracy improvement over existing methods on the CIFAR-10-LT and CIFAR-100-LT datasets, together with strong results on large-scale datasets such as ImageNet-LT, iNaturalist 2018, and Places-LT.
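The highlights describe experts that each supervise one class group (head, mid, tail) when distilling into a student. The general idea can be sketched as a grouped knowledge-distillation objective, where each expert's softened predictions are compared against the student's only over that expert's own class group. This is a minimal NumPy sketch under that assumption; the function names, the masked-and-renormalized KL formulation, and the temperature handling are illustrative, not the paper's actual loss.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def multi_expert_kd_loss(student_logits, expert_logits, groups, T=2.0):
    """Sum of KL(teacher || student) terms, one per class group.

    student_logits: (batch, num_classes) array from the student.
    expert_logits:  list of (batch, num_classes) arrays, one per expert.
    groups:         list of index arrays, the class group each expert owns
                    (e.g. head / mid / tail class indices).
    """
    p_s = softmax(student_logits, T)
    loss = 0.0
    for logits_t, idx in zip(expert_logits, groups):
        p_t = softmax(logits_t, T)
        # Restrict supervision to the expert's own classes and
        # renormalize the masked probabilities to sum to one.
        pt = p_t[..., idx]
        pt = pt / pt.sum(axis=-1, keepdims=True)
        ps = p_s[..., idx]
        ps = ps / ps.sum(axis=-1, keepdims=True)
        loss += np.sum(pt * (np.log(pt) - np.log(ps)), axis=-1).mean()
    # T^2 rescaling is the standard Hinton-style KD correction.
    return (T ** 2) * loss
```

With identical student and expert logits the loss is zero; it grows as the student's within-group distribution drifts from the expert's, so each expert only penalizes mistakes inside the class range it specializes in.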