Abstract: Academics, activists, and regulators are increasingly urging companies to develop and deploy sociotechnical systems that are fair and unbiased. Achieving this goal, however, is complex: the developer must (1) deeply engage with social and legal facets of “fairness” in a given context, (2) develop software that concretizes these values, and (3) undergo an independent algorithm audit to ensure technical correctness and social accountability of their algorithms. To date, there are few examples of companies that have transparently undertaken all three steps.
In this paper we outline a framework for algorithmic auditing by way of a case study of pymetrics, a startup that uses machine learning to recommend job candidates to its clients. We discuss how pymetrics approaches the question of fairness given the constraints of ethical, regulatory, and client demands, and how pymetrics’ software implements adverse impact testing. We also present the results of an independent audit of pymetrics’ candidate screening tool.
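Although the abstract does not detail how pymetrics’ software performs adverse impact testing, such testing in employment contexts is commonly operationalized through the EEOC’s “four-fifths rule”: a screen is flagged if any group’s selection rate falls below 80% of the highest group’s rate. The Python sketch below illustrates that standard check; the function names and numbers are illustrative assumptions, not pymetrics’ actual code.

# Hedged sketch of the EEOC "four-fifths rule" adverse impact test.
# Names and numbers are illustrative, not pymetrics' implementation.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of a group's applicants recommended by the screen."""
    return selected / applicants

def passes_four_fifths(rates: dict) -> bool:
    """True if every group's selection rate is at least 4/5 (80%) of
    the highest group's rate, i.e., no adverse impact under this test."""
    highest = max(rates.values())
    return all(rate / highest >= 0.8 for rate in rates.values())

# Illustrative selection rates for two demographic groups.
rates = {
    "group_a": selection_rate(50, 100),  # 0.50
    "group_b": selection_rate(38, 100),  # 0.38
}
print(passes_four_fifths(rates))  # False: 0.38 / 0.50 = 0.76 < 0.8

In practice, a check of this kind would be run on a model’s recommendations for each protected attribute before a screening tool is deployed.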
We conclude with recommendations on how to structure audits to be practical, independent, and constructive, so that companies have a stronger incentive to participate in third-party audits and watchdog groups are better prepared to investigate companies.