Enhanced Knowledge Distillation for Face Recognition

Published: 01 Jan 2019, Last Modified: 13 May 2023. Venue: ISPA/BDCloud/SocialCom/SustainCom 2019
Abstract: Face recognition has been widely adopted across industries thanks to advances in deep convolutional neural networks. Although deep learning has greatly advanced face recognition technology, its compute-intensive and memory-intensive nature makes the resulting models difficult to deploy on embedded devices and mobile computing platforms. Many solutions, including Knowledge Distillation, have been proposed to speed up model inference and reduce the storage required for computation. In this paper, we propose a novel Two Stage Knowledge Distillation that enhances the performance of knowledge distillation in face recognition and low-resolution face recognition. Experiments on several major face datasets show that our method achieves better results than traditional optimization methods.
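The abstract does not detail the Two Stage method itself, but the underlying technique it builds on, classic knowledge distillation (Hinton et al.), trains a compact student network to match a teacher's temperature-softened output distribution alongside the hard labels. A minimal NumPy sketch of that standard distillation objective follows; the function names and hyperparameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution.
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Standard KD objective: alpha * softened KL term + (1 - alpha) * hard CE term.

    T and alpha are typical illustrative values, not the paper's settings.
    """
    p_t = softmax(teacher_logits, T)  # softened teacher targets
    p_s = softmax(student_logits, T)  # softened student predictions
    # KL(teacher || student) on the softened distributions, scaled by T^2
    # so the soft-target gradients keep a comparable magnitude.
    kl = np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)), axis=-1)
    # Ordinary cross-entropy against the hard labels at T = 1.
    q_s = softmax(student_logits, 1.0)
    ce = -np.log(q_s[np.arange(len(labels)), labels] + 1e-12)
    return float(np.mean(alpha * (T ** 2) * kl + (1 - alpha) * ce))
```

A student whose logits already match the teacher's incurs only the hard-label term, while a disagreeing student is penalized by both terms, which is what drives the compact model toward the teacher's behavior.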
