Query Design for Crowdsourced Clustering: Effect of Cognitive Overload and Contextual Bias

Published: 17 Jun 2024 (Last Modified: 02 Jul 2024)
Venue: ICML 2024 Workshop MHFAIA Poster
License: CC BY 4.0
Keywords: cognitive behavior, crowdsourcing, comparison labeling
TL;DR: Model design for crowdsourcing systems should account for the effects of cognitive overload and contextual bias.
Abstract: Crowdsourced clustering leverages human input to group items into clusters. The design of tasks for crowdworkers, specifically the number of items presented per query, impacts answer quality and cognitive load. This work investigates the trade-off between query size and answer accuracy, revealing diminishing returns beyond 4-5 items per query. Crucially, we identify contextual bias in crowdworker responses – the likelihood of grouping items depends not only on their similarity but also on the other items present in the query. This structured noise contradicts assumptions made in existing noise models. Our findings underscore the need for more nuanced noise models that account for the complex interplay between items and query context in crowdsourced clustering tasks.
Submission Number: 44
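To make the contextual-bias finding concrete, below is a minimal, hypothetical simulation, not the paper's actual model. The similarity table, the bias coefficient (0.3), and the form of `p_group_contextual` are illustrative assumptions. It contrasts a pairwise-independent noise model, where the probability of grouping a pair depends only on that pair, with a context-dependent one, where the other items in the query shift that probability.

```python
# Illustrative sketch (assumed values, not the paper's model): how the same pair
# of items can be grouped with different probability depending on query context.
import random

# Assumed pairwise similarities between items (0 = dissimilar, 1 = identical).
SIMILARITY = {
    frozenset({"a", "b"}): 0.60,
    frozenset({"a", "c"}): 0.20,
    frozenset({"b", "c"}): 0.25,
    frozenset({"a", "d"}): 0.55,
    frozenset({"b", "d"}): 0.50,
    frozenset({"c", "d"}): 0.30,
}

def p_group_independent(i, j):
    """Context-independent model: grouping probability depends only on the pair."""
    return SIMILARITY[frozenset({i, j})]

def p_group_contextual(i, j, query):
    """Context-dependent model (assumed form): the remaining items in the query
    act as a reference set. When the rest of the query is dissimilar to the pair,
    the pair looks relatively more alike, so the grouping probability increases."""
    others = [k for k in query if k not in (i, j)]
    if not others:
        return p_group_independent(i, j)
    # Average similarity between the pair and the remaining query items.
    background = sum(
        SIMILARITY[frozenset({i, k})] + SIMILARITY[frozenset({j, k})]
        for k in others
    ) / (2 * len(others))
    base = p_group_independent(i, j)
    # Illustrative bias term with an assumed coefficient of 0.3.
    return min(1.0, max(0.0, base + 0.3 * (base - background)))

def simulate_answer(i, j, query, rng):
    """One synthetic crowdworker answer under the contextual model."""
    return rng.random() < p_group_contextual(i, j, query)

if __name__ == "__main__":
    rng = random.Random(0)
    # The same pair (a, b) queried alongside two different third items.
    for query in (["a", "b", "c"], ["a", "b", "d"]):
        votes = [simulate_answer("a", "b", query, rng) for _ in range(10_000)]
        print(query, "-> empirical P(group a, b) =", round(sum(votes) / len(votes), 3))
```

Running this prints a noticeably higher grouping rate for ("a", "b") when the third item is the dissimilar "c" than when it is the similar "d", which is the kind of structured, query-dependent noise that a pairwise-independent model cannot represent.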