A knowledge distillation-based multi-scale relation-prototypical network for cross-domain few-shot defect classification

Published: 01 Jan 2024, Last Modified: 02 Mar 2025, J. Intell. Manuf. 2024, CC BY-SA 4.0
Abstract: Surface defect classification plays a very important role in industrial production and mechanical manufacturing. However, several challenges currently hinder its use. The first is that the similarity between different defect samples makes classification a difficult task. The second is that the lack of defect samples leads to poor accuracy when using deep learning methods. In this paper, we first design a novel backbone network, ResMSNet, which draws on the idea of multi-scale feature extraction to capture the small discriminative regions in defect samples. Then, we introduce few-shot learning for defect classification and propose a Relation-Prototypical network (RPNet), which combines the characteristics of ProtoNet and RelationNet and classifies by linking prototype distances with nonlinear relation scores. Next, we consider a more realistic scenario, called cross-domain few-shot learning, in which the base dataset used to train the model and the target defect dataset to which the model is applied come from domains with large differences. Hence, we further improve RPNet into KD-RPNet, inspired by knowledge distillation methods. Through extensive comparative and ablation experiments, we demonstrate that both ResMSNet and RPNet are effective and that KD-RPNet outperforms other state-of-the-art approaches for few-shot defect classification.
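For illustration only, below is a minimal sketch (in PyTorch, not the authors' released code) of how an RPNet-style head could link ProtoNet-style prototype distances with RelationNet-style nonlinear relation scores. The module names, dimensions, and the additive fusion rule are assumptions made for this example, and the random embeddings stand in for features from a backbone such as ResMSNet.

```python
# Hypothetical sketch of a prototype-distance + relation-score head.
# Not the paper's implementation; fusion rule and dimensions are assumptions.
import torch
import torch.nn as nn

class RelationHead(nn.Module):
    """Small MLP that maps a (query, prototype) feature pair to a relation score."""
    def __init__(self, feat_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden_dim),
            nn.ReLU(inplace=True),
            nn.Linear(hidden_dim, 1),
        )

    def forward(self, query_feats, prototypes):
        # query_feats: (Q, D), prototypes: (N, D) -> relation scores (Q, N)
        Q, N = query_feats.size(0), prototypes.size(0)
        pairs = torch.cat(
            [query_feats.unsqueeze(1).expand(Q, N, -1),
             prototypes.unsqueeze(0).expand(Q, N, -1)],
            dim=-1,
        )
        return self.mlp(pairs).squeeze(-1)

def episode_logits(support_feats, support_labels, query_feats, relation_head, n_way):
    """Fuse negative squared Euclidean prototype distances (ProtoNet-style)
    with learned relation scores (RelationNet-style) into per-class logits."""
    # Class prototypes: mean of the support embeddings of each class.
    prototypes = torch.stack(
        [support_feats[support_labels == c].mean(dim=0) for c in range(n_way)]
    )
    proto_logits = -torch.cdist(query_feats, prototypes) ** 2   # (Q, N)
    relation_logits = relation_head(query_feats, prototypes)    # (Q, N)
    return proto_logits + relation_logits                       # additive fusion (assumption)

# Toy usage on a 5-way 5-shot episode with random backbone embeddings.
if __name__ == "__main__":
    n_way, k_shot, n_query, feat_dim = 5, 5, 15, 128
    support = torch.randn(n_way * k_shot, feat_dim)
    labels = torch.arange(n_way).repeat_interleave(k_shot)
    queries = torch.randn(n_query, feat_dim)
    head = RelationHead(feat_dim)
    print(episode_logits(support, labels, queries, head, n_way).shape)  # torch.Size([15, 5])
```

The additive fusion shown here is only one plausible way to combine the two signals; the paper's actual coupling of prototype distances and relation scores, and its knowledge-distillation extension in KD-RPNet, may differ.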