The Diversity Crisis of Software Engineering for Artificial Intelligence

Published: 01 Jan 2020, Last Modified: 30 May 2025 · IEEE Softw. 2020 · CC BY-SA 4.0
Abstract: Artificial Intelligence (AI) is experiencing a "diversity crisis" [1]. Several reports [1]–[3] have shown that the breakthrough of modern AI has not yet improved on existing diversity challenges regarding gender, race, geography, and other factors, neither for the end users of AI products nor for the companies and organizations building them. Plenty of examples have surfaced in which biased data engineering practices or existing data sets led to incorrect, painful, or sometimes even harmful consequences for unassuming end users [4]. The problem is that ruling out such biases is not straightforward, owing to the sheer number of different bias types [5]. To stand a chance of eliminating as many biases as possible, most experts agree that the teams and organizations building AI products should be made more diverse [1]–[3]. This harkens back to Linus's law [6] for open source development ("given enough eyeballs, all bugs are shallow"), but applied to the development process of AI products.