Toward Fairness Across Skin Tones in Dermatological Image Processing

Published: 01 Jan 2023 · Last Modified: 29 Jul 2025 · MIPR 2023 · CC BY-SA 4.0
Abstract: Skin cancer is a prevalent and concerning form of cancer, with an estimated annual incidence of more than 3 million cases in the US. In recent years, the field of medical image processing has made remarkable progress in skin cancer detection, surpassing the diagnostic capabilities of dermatologists in certain settings. However, the performance of these deep learning detection models has been reported to vary significantly across skin tones (e.g., light versus dark), motivating the need for fair and unbiased classification results. Here, we evaluate DeepDerm [10], a state-of-the-art skin cancer detection model, focusing on its performance across skin types classified by the Fitzpatrick Skin Tone (FST) scale. By analyzing the model's accuracy and fairness, we observe notable discrepancies in its performance across FST categories. We propose a novel architecture that combines fine-tuning, an ensemble architecture, and fairness-based resampling to support both high accuracy and fairness in skin cancer detection. The proposed framework demonstrates promising outcomes, marking a significant stride toward achieving fairness and accuracy in dermatological image processing.
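The abstract does not specify how the fairness-based resampling is performed. A minimal sketch, assuming a common group-balanced oversampling heuristic: minority FST groups are oversampled with replacement until every group matches the size of the largest group. The function name `fairness_resample` and the toy FST labels are illustrative, not taken from the paper.

```python
import random
from collections import Counter

def fairness_resample(samples, group_key, seed=0):
    """Oversample minority groups (e.g., darker FST categories) with
    replacement until each group reaches the size of the largest group.
    This is a generic fairness-based resampling heuristic; the paper's
    exact scheme may differ."""
    rng = random.Random(seed)
    groups = {}
    for s in samples:
        groups.setdefault(s[group_key], []).append(s)
    target = max(len(members) for members in groups.values())
    balanced = []
    for members in groups.values():
        balanced.extend(members)
        # Draw extra samples with replacement until this group reaches target size
        balanced.extend(rng.choice(members) for _ in range(target - len(members)))
    return balanced

# Toy dataset: light skin tones (FST I-II) overrepresented 4:1 versus FST V-VI
data = [{"fst": "I-II"}] * 8 + [{"fst": "V-VI"}] * 2
counts = Counter(s["fst"] for s in fairness_resample(data, "fst"))
```

After resampling, each FST group contributes equally to training, which counteracts the skew toward light skin tones typical of dermatology datasets.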