Examining the impact of bias mitigation algorithms on the sustainability of ML-enabled systems: A benchmark study
Abstract

Context: As machine learning (ML) systems become increasingly prevalent across industries, concerns regarding fairness have intensified. Bias mitigation algorithms, which aim to reduce bias in ML models, are a common response to this issue. However, these techniques can affect more than social sustainability. They may alter the computational overhead and energy usage of ML systems, affecting their environmental sustainability. Similarly, they can influence businesses' economic sustainability by shaping resource allocation and consumer trust.

Goal: This work provides a benchmark study of the implications of applying bias mitigation algorithms for the sustainability of ML solutions. We first corroborate previous findings by examining their effect on social sustainability metrics. We then complement existing studies with a comprehensive analysis of how bias mitigation affects environmental and economic sustainability, highlighting trade-offs for practitioners designing ML solutions.

Method: We evaluate six bias mitigation algorithms by conducting 3,360 experiments across multiple configurations of four ML algorithms and datasets. From these experiments, we compute metrics for social, environmental, and economic sustainability and evaluate them using statistical analysis.

Results: Our quantitative findings show that all bias mitigation algorithms affect the three sustainability dimensions differently, indicating that applying these algorithms involves complex trade-offs. We expand the discussion with qualitative insights arising from these results and derive implications for both research and practice.

Conclusions: Our study emphasizes the need for a deeper investigation into the trade-offs bias mitigation algorithms introduce and how they impact the various non-functional requirements of ML systems.

Editor's note: Open Science material was validated by the Journal of Systems and Software Open Science Board.
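To make the kind of measurement the Method describes concrete, the sketch below shows how one experiment of this type could look: apply a pre-processing bias mitigation algorithm, record a fairness metric before and after (social sustainability), and record training time as a rough proxy for computational cost (environmental/economic sustainability). This is a minimal illustration, not the paper's code: it assumes the AIF360 toolkit with its Adult dataset files available locally and scikit-learn, and the chosen protected attribute, algorithm, and metrics are placeholders rather than the study's actual configuration.

```python
# Illustrative sketch only; not the benchmark's actual pipeline.
# Assumes aif360 (with the Adult dataset files downloaded) and scikit-learn.
import time

from aif360.algorithms.preprocessing import Reweighing
from aif360.datasets import AdultDataset
from aif360.metrics import BinaryLabelDatasetMetric
from sklearn.linear_model import LogisticRegression

# Hypothetical group definitions for the 'sex' protected attribute.
privileged = [{"sex": 1}]
unprivileged = [{"sex": 0}]

dataset = AdultDataset()                      # one possible benchmark dataset
train, test = dataset.split([0.7], shuffle=True)

# Social sustainability: fairness of the training data before mitigation.
before = BinaryLabelDatasetMetric(
    train, unprivileged_groups=unprivileged, privileged_groups=privileged
).statistical_parity_difference()

# Apply one pre-processing bias mitigation algorithm (Reweighing).
mitigated = Reweighing(
    unprivileged_groups=unprivileged, privileged_groups=privileged
).fit_transform(train)

after = BinaryLabelDatasetMetric(
    mitigated, unprivileged_groups=unprivileged, privileged_groups=privileged
).statistical_parity_difference()

# Environmental/economic proxy: wall-clock training time on the mitigated data
# (a full study would measure energy consumption directly).
start = time.perf_counter()
LogisticRegression(max_iter=1000).fit(
    mitigated.features,
    mitigated.labels.ravel(),
    sample_weight=mitigated.instance_weights,
)
elapsed = time.perf_counter() - start

print(f"Statistical parity difference: {before:.3f} -> {after:.3f}")
print(f"Training time with mitigation: {elapsed:.2f}s")
```

Repeating such runs across several mitigation algorithms, ML models, and datasets, and comparing the resulting fairness, energy, and cost metrics statistically, is the general shape of the benchmark the abstract summarizes.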