Who Is to Blame? Responsibility Attribution in AI Systems vs Human Agents in the Field of Air Crashes
Abstract: This study explores how adults assign responsibility to different agents, both Artificial Intelligence (AI) systems and human beings, in the context of an airplane crash, based on the factors of criticality and pivotality. Criticality concerns the perceived importance of an agent's actions for achieving an outcome (prospective judgements), while pivotality concerns the degree to which the agent's actions contributed to the actual outcome (retrospective judgements). Our results replicate previous findings, demonstrating that participants are sensitive to both factors: they rate agents involved in a conjunctive structure as more critical than those in a disjunctive one, and they hold agents more responsible when their errors are completely pivotal to the crash. Interestingly, participants attribute more responsibility to human beings than to AI systems, but this trend emerges only in trials where pivotality is reduced.