Keywords: crowdsourcing, peer review, coding theory
Abstract: Conference peer review aims to assess paper quality accurately while minimizing the review load. This paper explores optimal conference protocols --- rules for assigning review tasks to reviewers and inferring paper quality from the noisy reviews. The widely used *direct review* protocol assigns multiple independent reviewers to each paper in an *isolated* and *parallel* manner. However, as submission volumes grow, more complex protocols have emerged, e.g., two-phase review and meta-review.
In this paper, we investigate whether and when these more complex *joint* and *adaptive* protocols can reduce the *review load ratio*, the number of review tasks per paper. Using tools from information theory and coding theory, we establish the following results:
- We prove that the optimal load ratio for isolated protocols is $\Theta(\ln n/\epsilon)$, where $n$ is the number of papers and $\epsilon$ is the error probability, indicating that the review load per paper must grow as the number of papers grows.
- We prove that the optimal load ratio of joint protocols is a constant that depends on the agents' noise levels and is independent of both $n$ and $\epsilon$. This suggests that joint protocols, including two-phase review, can dramatically reduce the review burden.
- We empirically explore the design of two-phase review protocols and find that selecting borderline (ambiguous) papers for the second review phase significantly improves accuracy compared with the conventional approach of advancing the most promising fraction of papers to the second phase.
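The borderline-selection idea above can be illustrated with a small simulation. This is a hypothetical sketch, not the paper's actual protocol or experiments: the noise model (each review independently flips the paper's true binary quality with probability `p`), the budget split, and all function names (`review`, `simulate`) and parameters (`phase1`, `extra`, `select`) are assumptions made for illustration.

```python
import random

random.seed(0)

def review(q, p):
    # Noisy binary review: reports the true quality q, flipped with probability p.
    return q if random.random() > p else 1 - q

def simulate(n=2000, p=0.2, phase1=3, extra=2, select="borderline"):
    # n papers with i.i.d. binary qualities; each gets `phase1` reviews in phase 1.
    papers = [random.randint(0, 1) for _ in range(n)]
    votes = [[review(q, p) for _ in range(phase1)] for q in papers]
    # Phase 2: half the papers receive `extra` additional reviews each.
    # "borderline" picks papers with the smallest vote margin (most ambiguous);
    # "promising" picks papers with the highest vote totals (conventional top fraction).
    margins = [abs(sum(v) - phase1 / 2) for v in votes]
    if select == "borderline":
        order = sorted(range(n), key=lambda i: margins[i])
    else:
        order = sorted(range(n), key=lambda i: -sum(votes[i]))
    for i in order[: n // 2]:
        votes[i] += [review(papers[i], p) for _ in range(extra)]
    # Final decision: majority vote over all reviews a paper received.
    decisions = [int(sum(v) > len(v) / 2) for v in votes]
    return sum(d == q for d, q in zip(decisions, papers)) / n
```

Under this toy model, spending the phase-2 budget on split-vote papers corrects most of phase 1's likely errors, whereas spending it on unanimously endorsed papers reinforces decisions that were already almost certainly correct.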
Supplementary Material: zip
Primary Area: other topics in machine learning (i.e., none of the above)
Submission Number: 14435