Just Resource Allocation? How Algorithmic Predictions and Human Notions of Justice Interact

11 Jan 2023 · OpenReview Archive Direct Upload
Abstract: We examine justice in data-aided decisions in the context of a scarce societal resource allocation problem. Non-experts (recruited on Amazon Mechanical Turk) must determine which homeless households to serve with limited housing assistance. We empirically elicit decision-maker preferences for whether to prioritize more vulnerable households or households that would best take advantage of more intensive interventions. We present three main findings. (1) When vulnerability or outcomes are quantitatively conceptualized and presented, humans (at a single point in time) are remarkably consistent in making either vulnerability- or outcome-oriented decisions. (2) Prior exposure to quantitative outcome predictions has a significant effect, shifting the preferences of human decision-makers from vulnerability-oriented to outcome-oriented about one-third of the time. (3) Presenting algorithmically derived risk predictions in addition to household descriptions reinforces decision-maker preferences: among the vulnerability-oriented, the risk predictions lead to a significant increase in allocations to the more vulnerable household, whereas among the outcome-oriented they lead to a significant decrease. These findings emphasize the importance of explicitly aligning data-driven decision aids with system-wide allocation goals.