A task assignment strategy for crowdsourcing-based web accessibility evaluation system

W4A 2017 (modified: 18 Nov 2022)
Abstract: Web accessibility evaluation aims to identify the interaction barriers that people with disabilities face in accessing content on the Web. Because some checkpoints require human inspection for conformance evaluation, evaluating a website usually incurs a high cost. To address this issue, crowdsourcing-based systems are used in web accessibility evaluation to elicit contributions from volunteer participants. However, some accessibility evaluation tasks are complicated and require a certain level of evaluation expertise. This makes task assignment in crowdsourcing a challenging problem: poor evaluation accuracy results when complicated tasks are assigned to inexperienced participants. To address this issue, we propose in this paper a novel task assignment strategy called Evaluator-Decision-Based Assignment (EDBA) to better leverage the participation and expertise of the volunteers. Using evaluators' historical evaluation records and experts' reviews, we train a minimum cost model via machine learning methods to obtain an optimal task assignment map. Experiments on the Chinese Web Accessibility Evaluation System show that our method achieves high accuracy in website accessibility evaluation. Meanwhile, the balanced assignments from EDBA also enable both novices and experienced evaluators to participate effectively in accessibility evaluation.
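The abstract describes obtaining an assignment map by minimizing expected cost. A minimal sketch of this general idea (not the authors' actual EDBA model): assume a hypothetical `accuracy` matrix of predicted per-evaluator, per-task correctness probabilities, turn it into an expected-error cost matrix, and solve the resulting assignment problem with the Hungarian algorithm.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def assign_tasks(accuracy):
    """Assign each task to one evaluator by minimizing expected error.

    accuracy[i, j] is a hypothetical predicted probability that
    evaluator i answers task j correctly (e.g. estimated from
    historical evaluation records).
    """
    cost = 1.0 - accuracy                # expected-error cost matrix
    rows, cols = linear_sum_assignment(cost)
    return {int(c): int(r) for r, c in zip(rows, cols)}  # task -> evaluator

# Two evaluators, two tasks: evaluator 0 is far better at the
# complicated task (column 1), so it should be routed to them,
# leaving the simpler task to the less experienced evaluator 1.
acc = np.array([[0.90, 0.95],
                [0.85, 0.60]])
print(assign_tasks(acc))  # -> {1: 0, 0: 1}
```

Minimizing total expected error here routes the hard task to the expert while still giving the novice a task, which mirrors the paper's goal of balanced assignments that let both novices and experienced evaluators participate.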