Abstract: Face recognition has been widely adopted across industries thanks to advances in deep convolutional neural networks. Although deep learning has greatly advanced face recognition technology, its compute-intensive and memory-intensive nature makes such models difficult to deploy on embedded devices or mobile computing platforms. Many solutions, including Knowledge Distillation, have been proposed to increase model inference speed and reduce the storage required for computation. In this paper, we propose a novel Two Stage Knowledge Distillation that enhances the performance of knowledge distillation in face recognition and low-resolution face recognition. Experiments on several major face datasets show that our method achieves better results than traditional optimization methods.
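For readers unfamiliar with the base technique the paper builds on, the following is a minimal sketch of a standard single-stage knowledge-distillation loss (temperature-softened KL divergence between teacher and student logits, in the style of Hinton et al.). It is an illustrative NumPy implementation, not the paper's Two Stage method; the function names and the temperature value are our own choices.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax: higher T yields a softer distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    e = np.exp((z - z.max()) / T)
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=4.0):
    # KL divergence between the temperature-softened teacher and student
    # distributions, scaled by T^2 so gradients stay comparable across T.
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # soft student predictions
    return (T ** 2) * np.sum(p * (np.log(p) - np.log(q)))

# A student that matches the teacher exactly incurs zero loss;
# any mismatch yields a positive loss.
teacher = np.array([2.0, 1.0, 0.1])
student = np.array([1.0, 1.5, 0.2])
print(distillation_loss(teacher, teacher))  # ~0.0
print(distillation_loss(student, teacher) > 0)
```

In practice this term is combined with the ordinary cross-entropy on hard labels; the paper's contribution is to organize distillation into two stages rather than to change this base loss.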