Impact of feedback on crowdsourced visual quality assessment with paired comparisons

Published: 01 Jan 2024, Last Modified: 13 Nov 2024 · QoMEX 2024 · CC BY-SA 4.0
Abstract: This paper presents a comprehensive investigation into the effects of immediate feedback on crowdworkers’ performance in subjective image quality assessment tasks using paired comparisons. The study is motivated by the need for reliable and efficient crowdsourcing tasks for image quality assessment. A large-scale experiment involving 200 participants was conducted, in which participants completed 120 paired comparisons with and without feedback. The feedback informed workers whether their responses to comparisons were correct. Almost all participants (97%) preferred receiving feedback. The results indicate that feedback reduced response time, improved user experience, and did not bias the estimation of the just noticeable difference (JND). On the other hand, feedback did not significantly affect accuracy or correlation with the ground truth, nor did it produce a learning effect. This study contributes to the field by being one of the first to examine the impact of feedback on crowdworker performance in subjective image quality assessment tasks. The dataset, which includes the images and ratings, can be accessed at https://database.mmsp-kn.de/feedback-study-dataset.html.
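To illustrate the kind of JND estimation the abstract refers to, the sketch below fits a logistic psychometric function to paired-comparison outcomes and reads off the distortion level at which observers answer correctly 75% of the time. This is a minimal illustrative example, not the paper's actual estimation procedure; the function name `estimate_jnd`, the grid-search fit, and the 75%-correct criterion are assumptions for the sake of the demonstration.

```python
import numpy as np

def estimate_jnd(levels, correct, total, threshold=0.75):
    """Illustrative JND estimate from paired-comparison data (hypothetical
    helper, not the paper's method).

    levels: distortion levels tested; correct/total: correct responses and
    trials per level. Fits p(level) = 0.5 + 0.5 * sigmoid(slope*(level-mid))
    by grid search over (mid, slope), then inverts the fitted curve at
    `threshold` proportion correct (chance performance is 0.5)."""
    levels = np.asarray(levels, dtype=float)
    p_obs = np.asarray(correct, dtype=float) / np.asarray(total, dtype=float)
    best = None
    for mid in np.linspace(levels.min(), levels.max(), 201):
        for slope in np.linspace(0.1, 10.0, 100):
            p = 0.5 + 0.5 / (1.0 + np.exp(-slope * (levels - mid)))
            sse = np.sum((p - p_obs) ** 2)  # least-squares fit criterion
            if best is None or sse < best[0]:
                best = (sse, mid, slope)
    _, mid, slope = best
    # Invert threshold = 0.5 + 0.5/(1 + exp(-slope*(x - mid))) for x:
    return mid - np.log(0.5 / (threshold - 0.5) - 1.0) / slope
```

With `threshold=0.75` the estimate coincides with the fitted midpoint `mid`, the level at which the distortion is detected on half of the above-chance trials; in practice one would use a maximum-likelihood fit with confidence intervals rather than this least-squares grid search.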