Active Ranking without Strong Stochastic Transitivity

Published: 31 Oct 2022, Last Modified: 08 Jan 2023
NeurIPS 2022 Accept
Readers: Everyone
Keywords: ranking, noisy comparison, WST, SST, sample complexity
TL;DR: An active ranking algorithm for the Weak Stochastic Transitivity (WST) setting is proposed; sample complexity upper and lower bounds are proved and shown to improve on the state of the art.
Abstract: Ranking from noisy comparisons is of great practical interest in machine learning. In this paper, we consider the problem of recovering the exact full ranking for a list of items under ranking models that do *not* assume the Strong Stochastic Transitivity property. We propose a $\delta$-correct algorithm, Probe-Rank, that actively learns the ranking of the items from noisy pairwise comparisons. We prove a sample complexity upper bound for Probe-Rank, which only depends on the preference probabilities between items that are adjacent in the true ranking. This improves upon existing sample complexity results that depend on the preference probabilities for all pairs of items. Probe-Rank thus outperforms existing methods over a large collection of instances that do not satisfy Strong Stochastic Transitivity. Thorough numerical experiments in various settings are conducted, demonstrating that Probe-Rank is significantly more sample-efficient than the state-of-the-art active ranking method.
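For readers skimming the abstract, the following is a minimal, hypothetical Python sketch of the general setup it describes: pairwise comparisons are queried adaptively until the full order is resolved with error probability at most $\delta$. The names `noisy_compare`, `compare_until_confident`, and `active_rank`, the Hoeffding-style stopping rule, and the insertion-based ordering are illustrative assumptions; they are not the paper's Probe-Rank algorithm or its sample complexity analysis.

```python
import math
import random

def noisy_compare(p_ij):
    """Simulate one noisy comparison: returns True with probability p_ij,
    the (unknown) probability that item i beats item j."""
    return random.random() < p_ij

def compare_until_confident(p_ij, delta):
    """Query a noisy comparison repeatedly until an anytime Hoeffding-style
    confidence interval around the empirical win rate separates from 1/2,
    so the declared winner is wrong with probability at most delta.
    Assumes p_ij != 1/2; illustrative subroutine only, not Probe-Rank."""
    wins, t = 0, 0
    while True:
        t += 1
        wins += noisy_compare(p_ij)
        # confidence radius with a union bound over rounds t = 1, 2, ...
        radius = math.sqrt(math.log(4 * t * t / delta) / (2 * t))
        p_hat = wins / t
        if p_hat - radius > 0.5:
            return True   # i declared preferred to j
        if p_hat + radius < 0.5:
            return False  # j declared preferred to i

def active_rank(items, pref, delta):
    """Insertion-based active ranking: insert each item into the current
    ordering using confident pairwise calls. pref[i][j] is the probability
    that item i beats item j (known only to the simulator)."""
    per_call_delta = delta / max(1, len(items) ** 2)  # crude union bound
    ranking = []
    for x in items:
        pos = 0
        while pos < len(ranking) and compare_until_confident(
                pref[ranking[pos]][x], per_call_delta):
            pos += 1
        ranking.insert(pos, x)
    return ranking  # most preferred item first
```

Under Weak Stochastic Transitivity every pairwise preference is consistent with the underlying total order, so a comparison-based insertion scheme like the sketch above recovers the exact ranking once each pairwise call is correct; the per-call error budget is split by a simple union bound over the at most quadratically many comparisons.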
Supplementary Material: pdf