Statistical Methods for Auditing the Quality of Manual Content Reviews

01 Mar 2023 (modified: 30 May 2023) · Submitted to Tiny Papers @ ICLR 2023
Keywords: Content moderation, Audit, Statistical analysis, Human review error
TL;DR: This paper evaluates quantitative, statistical methods for measuring and minimizing the audit risks that arise from human content reviews.
Abstract: Large technology firms face the problem of moderating content on their online platforms for compliance with laws and policies. At the scale of billions of pieces of content per day, a combination of human and machine review is necessary to label content. Subjective judgement and bias affect both human-annotated content and the auditors who may be employed to evaluate whether such annotations conform with law and/or policy. To address this concern, this paper presents a novel application of statistical analysis methods to identify human error and these sources of audit risk.
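As a minimal sketch of the kind of statistical analysis the abstract refers to (this is an illustrative assumption, not necessarily the paper's specific method), one common quality check for manual reviews is inter-rater agreement. The example below computes Cohen's kappa between two hypothetical reviewers labeling the same content items; the reviewer names and labels are invented for illustration.

```python
# Illustrative sketch: Cohen's kappa as an inter-rater agreement statistic
# for auditing manual review quality. Reviewer data below is hypothetical.

from collections import Counter


def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labeling the same items."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items where the two labels match.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected agreement under chance, from each annotator's label marginals.
    counts_a = Counter(labels_a)
    counts_b = Counter(labels_b)
    expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(labels_a) | set(labels_b)
    )
    return (observed - expected) / (1 - expected)


reviewer_1 = ["allow", "remove", "allow", "remove", "allow", "allow"]
reviewer_2 = ["allow", "remove", "remove", "remove", "allow", "allow"]
print(round(cohens_kappa(reviewer_1, reviewer_2), 3))  # → 0.667
```

A kappa near 1 indicates strong agreement beyond chance, while a value near 0 suggests the reviewers agree no more than random labeling would, flagging a potential source of audit risk.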