SMAFace: Sample Mining Guided Adaptive Loss for Face Recognition

20 Sept 2023 (modified: 25 Mar 2024) · ICLR 2024 Conference Withdrawn Submission
Keywords: face recognition, sample mining, low-quality images
TL;DR: We propose an innovative FR algorithm that enhances performance by incorporating sample mining into conventional margin-based methods.
Abstract: Traditional face recognition (FR) algorithms often rely solely on margin-based softmax loss functions. However, owing to noisy training data and varied image quality in datasets, these models tend to falter on low-quality images. To address this issue, we introduce SMAFace, an innovative FR algorithm that enhances performance by incorporating sample mining into conventional margin-based methods. At its core, SMAFace prioritizes information-dense samples, namely hard or easy samples, which carry more distinctive features. In this study, we employ a probability-driven mining strategy that enables the model to adeptly handle hard samples, thereby bolstering its robustness and adaptability. Mathematical analysis and empirical tests of SMAFace demonstrate its effectiveness. Moreover, experimental results show that our approach surpasses the state-of-the-art (SoTA) on four renowned benchmarks (CPLFW, VGG2-FP, IJB-B and TinyFace), highlighting its potential and efficiency.
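To make the abstract's high-level idea concrete, the sketch below shows, in PyTorch, how a probability-driven sample-mining weight can be attached to an ArcFace-style margin-based softmax loss. The specific weighting function, the `gamma` parameter, and all hyperparameter values here are illustrative assumptions, not the paper's actual SMAFace formulation.

```python
# Minimal sketch: margin-based softmax loss with a probability-driven
# sample-mining weight. Hypothetical stand-in for the method the abstract
# describes; not the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MarginMiningLoss(nn.Module):
    def __init__(self, embedding_dim, num_classes, scale=64.0, margin=0.5, gamma=2.0):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(num_classes, embedding_dim))
        nn.init.xavier_normal_(self.weight)
        self.scale = scale    # feature scale s
        self.margin = margin  # additive angular margin m
        self.gamma = gamma    # assumed mining-strength hyperparameter

    def forward(self, embeddings, labels):
        # Cosine similarity between normalized embeddings and class centers.
        cosine = F.linear(F.normalize(embeddings), F.normalize(self.weight))
        theta = torch.acos(cosine.clamp(-1 + 1e-7, 1 - 1e-7))

        # Apply the angular margin to the target-class logits (ArcFace-style).
        one_hot = F.one_hot(labels, num_classes=cosine.size(1)).float()
        logits = torch.cos(theta + one_hot * self.margin) * self.scale

        # Probability-driven mining weight: samples with low target-class
        # probability (hard samples) get larger weights. This focal-style
        # reweighting is only an assumed example of a mining strategy.
        with torch.no_grad():
            p_target = F.softmax(logits, dim=1).gather(1, labels.unsqueeze(1)).squeeze(1)
            mining_weight = (1.0 - p_target) ** self.gamma

        per_sample = F.cross_entropy(logits, labels, reduction="none")
        return (mining_weight * per_sample).mean()
```

Design note: keeping the mining weight outside the gradient (the `no_grad` block) treats it as a per-sample importance factor rather than part of the loss surface, which is one common way such reweighting schemes are implemented.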
Supplementary Material: zip
Primary Area: representation learning for computer vision, audio, language, and other modalities
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 2380