FALCON: Fair Face Recognition via Local Optimal Feature Normalization

Rouqaiah Al-Refai, Philipp Hempel, Clara Biagi, Philipp Terhörst

Published: WACV 2025 (last modified: 03 Mar 2026). License: CC BY-SA 4.0.
Abstract: Face recognition systems are widely used for identity verification in various fields. However, recent studies have highlighted bias issues related to demographic and non-demographic attributes such as accessories, hair color, ethnicity, or gender. These biases lead to higher error rates for specific attribute subgroups, which is especially problematic in critical areas like forensics, where these systems are deployed. Addressing this issue requires a solution that reduces bias without compromising accuracy. Existing methods focus on learning less biased face representations, but they are often difficult to integrate into current systems or degrade overall recognition performance. This work introduces FALCON (Fair Adaptation Through Local Optimal Normalization), an effective method for increasing fairness in face recognition systems. FALCON operates in an unsupervised manner, addressing bias without requiring demographic labels, and can be easily integrated as a post-processing step. By processing each image individually, it treats individuals with similar traits similarly, reducing bias in face recognition. The proposed method is rigorously tested across various face recognition models and datasets and compared with four other fairness post-processing methods. Results show that FALCON significantly enhances both fairness and accuracy. Unlike other methods, it allows the fairness-accuracy trade-off to be adjusted seamlessly while effectively addressing bias.
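The abstract does not specify FALCON's algorithm, but it does describe the setting: an unsupervised, label-free normalization applied as a post-processing step to each face embedding individually. The sketch below is purely illustrative of that setting (it is not the FALCON method itself); all function names are hypothetical, and the per-embedding standardization shown is one generic example of an image-wise normalization that requires no demographic labels.

```python
import numpy as np

def per_image_normalize(embedding: np.ndarray) -> np.ndarray:
    """Hypothetical per-image post-processing step: standardize an
    embedding by its own mean and std, then L2-normalize it.
    Illustrative only -- NOT the actual FALCON algorithm."""
    centered = embedding - embedding.mean()
    scaled = centered / (embedding.std() + 1e-8)  # avoid division by zero
    return scaled / np.linalg.norm(scaled)

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity of two L2-normalized embeddings."""
    return float(np.dot(a, b))

# Toy example: two embeddings of the same identity should stay more
# similar to each other than to an unrelated identity.
rng = np.random.default_rng(0)
e1 = rng.normal(size=512)
e2 = e1 + 0.1 * rng.normal(size=512)  # same identity, perturbed
e3 = rng.normal(size=512)             # different identity

n1, n2, n3 = map(per_image_normalize, (e1, e2, e3))
print(cosine_similarity(n1, n2) > cosine_similarity(n1, n3))  # expect True
```

Because each embedding is normalized using only its own statistics, such a step needs no demographic annotations and can be bolted onto an existing pipeline without retraining the recognition model, matching the integration property the abstract claims for FALCON.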