Individual Fairness as an Extension of Group Fairness

16 Sept 2023 (modified: 11 Feb 2024) · Submitted to ICLR 2024
Primary Area: societal considerations including fairness, safety, privacy
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Keywords: Individual fairness, group fairness, consistency, generalised entropy indices
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2024/AuthorGuide.
TL;DR: There is no conflict / trade-off between group and individual fairness.
Abstract: Since its formal definition in 2011, _individual fairness_ has received relatively little attention from the machine learning community compared to _group fairness_. There are several reasons for this. To implement it, one must define a similarity metric; obtaining such a metric is a non-trivial task and an active research area. According to individual fairness, discontinuity in modelling, and thus deterministic classification, is inherently unfair; to achieve individual fairness we must turn to probabilistic models in which predictions are randomised. For many, this flies in the face of logic. Perhaps most importantly, researchers hold conflicting views on its compatibility with group fairness. In this work we present arguments which resolve conflicting research on the nature of individual fairness. We clarify its important defining features, framing it as an extension of group fairness rather than as acting in opposition to it. We review empirical evidence of the trade-off between group and individual fairness and derive a new representation for the associated individual fairness metric (which we term _individual cost_). With this new representation we prove that individual cost is strongly related to utility. We conclude that the empirical evidence does not support the existence of a trade-off between group and individual fairness but rather, in all likelihood, demonstrates the well-known trade-off between fairness and utility.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors' identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 754
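For readers unfamiliar with the generalised entropy indices named in the keywords, the sketch below computes the standard index over per-individual benefits (following Speicher et al., 2018), the family of metrics commonly used to quantify this kind of individual cost. It is an illustrative implementation of that standard definition only, not the new representation derived in the paper; the function name and toy data are hypothetical.

```python
import numpy as np

def generalized_entropy_index(y_true, y_pred, alpha=2.0):
    """Generalised entropy index of per-individual benefits.

    Standard formulation (Speicher et al., 2018): each individual's benefit is
    b_i = y_pred_i - y_true_i + 1, and the index measures how unequally the
    benefits are distributed across individuals (0 = perfect equality).
    Assumes alpha not in {0, 1}; alpha = 2 is the common choice.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    b = y_pred - y_true + 1.0                     # per-individual benefit
    mu = b.mean()                                 # mean benefit
    return np.mean((b / mu) ** alpha - 1.0) / (alpha * (alpha - 1.0))

# Toy example with binary labels and hard predictions.
y_true = np.array([1, 0, 1, 1, 0, 0])
y_pred = np.array([1, 0, 0, 1, 1, 0])
print(generalized_entropy_index(y_true, y_pred))  # higher = less individually fair
```

An index of 0 corresponds to every individual receiving the same benefit; larger values indicate that prediction errors are concentrated on particular individuals.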