Abstract: Science is constantly developing as new information is discovered. Papers discredited by the scientific community may be retracted. Such papers might have been cited before they were retracted (as well as afterwards), potentially spreading a chain of unreliable information. To address this, Fu and Schneider (2020) introduced the keystone framework for auditing whether and how a paper fundamentally depends on another paper, and proposed that an alerting system be developed to flag papers that fundamentally depend on retracted papers. The main challenge of such an alerting system is its need for expert labor. This paper tests whether a flowchart process for non-experts can accurately assess dependencies between papers, reducing the need for expert assessment. We do this by developing such a process and testing it on citations to one highly cited retracted paper. In our case study, non-experts using our process could resolve the question of dependency in about half the cases. Two annotators had 92.9% agreement on the 85 papers annotated, reaching 100% agreement after discussion. In future work we will assess the reliability of non-experts’ decisions as compared to experts’, and identify opportunities for automation.