Born Again Neural Rankers

Published: 28 Jan 2022, Last Modified: 13 Feb 2023, ICLR 2022 Submitted
Keywords: learning to rank, knowledge distillation, neural networks
Abstract: We introduce Born Again neural Rankers (BAR) in the Learning to Rank (LTR) setting, where student rankers, trained in the Knowledge Distillation (KD) framework, are parameterized identically to their teachers. Unlike existing ranking distillation work, which pursues a good trade-off between performance and efficiency, BAR adapts the idea of Born Again Networks (BAN) to ranking problems and significantly improves the ranking performance of students over their teacher rankers without increasing model capacity. By examining the key differences between ranking distillation and common distillation for classification problems, we find that the key success factors of BAR lie in (1) an appropriate teacher score transformation function, and (2) a novel listwise distillation framework, both of which are specifically designed for ranking problems and are rarely studied in the knowledge distillation literature. Using state-of-the-art neural ranking architectures, BAR pushes the limits of neural rankers beyond those reported in a recent rigorous benchmark study and significantly outperforms strong gradient boosted decision tree based models on 7 out of 9 key metrics, for the first time in the literature. In addition to the strong empirical results, we give theoretical explanations of why listwise distillation is effective for neural rankers.
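The abstract's two ingredients, a teacher score transformation and a listwise distillation objective, can be illustrated with a minimal sketch (not the authors' code): the teacher's per-query scores are passed through an assumed monotone transformation (here a hypothetical power transform with exponent gamma), turned into a softmax distribution over the documents in each list, and the student is trained with a ListNet-style listwise cross-entropy against that distribution.

```python
import torch
import torch.nn.functional as F

def listwise_distillation_loss(student_scores: torch.Tensor,
                               teacher_scores: torch.Tensor,
                               gamma: float = 1.0) -> torch.Tensor:
    """Sketch of a listwise distillation loss for ranking.

    student_scores, teacher_scores: [num_queries, list_size] score lists.
    gamma: hypothetical exponent for the teacher score transformation.
    """
    # Transform teacher scores; the power transform is an illustrative choice,
    # not necessarily the transformation used in the paper.
    transformed = teacher_scores.clamp(min=0.0).pow(gamma)
    # Teacher target distribution over the documents within each list.
    target = F.softmax(transformed, dim=-1)
    # Listwise (ListNet-style) cross-entropy against the student's softmax.
    log_student = F.log_softmax(student_scores, dim=-1)
    return -(target * log_student).sum(dim=-1).mean()
```

In a born-again setup the student shares the teacher's architecture, so a loss of this form would simply replace, or be added to, the usual ranking loss when retraining the same model from the teacher's scores.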
One-sentence Summary: Born again neural rankers can achieve state-of-the-art ranking performance