Group Fairness Refocused: Assessing the Social Impact of ML Systems

Published: 01 Jan 2024 · Last Modified: 06 May 2025 · SDS 2024 · CC BY-SA 4.0
Abstract: Fairness as a property of a prediction-based decision system is a question of its impact on the lives of affected people, which is only partially captured by standard fairness metrics. In this paper, we present a formal framework for the impact assessment of prediction-based decision systems based on the paradigm of group fairness. We generalize the equality requirements of standard fairness criteria to the concept of equality of expected impact, and we show that standard fairness criteria can be interpreted as special cases of this generalization. Furthermore, we provide a systematic and practical method for determining the necessary utility functions for modeling the impact. We conclude with a discussion of possible extensions of our approach.
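The abstract's central idea — that standard group fairness criteria are special cases of equality of expected impact under a suitable utility function — can be illustrated with a minimal sketch. The helper `expected_impact` below and the data are hypothetical (not from the paper); the sketch only shows that with the utility u(d, y) = d, comparing per-group expected impact reduces to comparing per-group acceptance rates, i.e. statistical parity.

```python
import numpy as np

def expected_impact(decisions, outcomes, groups, utility):
    """Per-group expected impact E[u(D, Y) | A = a] for each group a."""
    return {
        a: float(np.mean([utility(d, y)
                          for d, y, g in zip(decisions, outcomes, groups)
                          if g == a]))
        for a in set(groups)
    }

# Hypothetical data: binary decisions D, outcomes Y, group labels A.
rng = np.random.default_rng(0)
D = rng.integers(0, 2, 200)
Y = rng.integers(0, 2, 200)
A = rng.choice(["a", "b"], 200)

# With u(d, y) = d, equality of expected impact across groups is
# exactly equality of P(D = 1 | A = a), i.e. statistical parity.
print(expected_impact(D, Y, A, lambda d, y: d))
```

Other standard criteria follow from other utilities, e.g. restricting to y = 1 and using u(d, y) = d recovers equal opportunity as equality of true-positive rates.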