BERT-SMAP: Paying attention to Essential Terms in passage ranking beyond BERT

Published: 2022 · Last Modified: 23 Jan 2026 · Inf. Process. Manag. 2022 · CC BY-SA 4.0
Abstract: Highlights
• We propose a hybrid ranking architecture for passage ranking that addresses the problem of ranking models being misled by passages that overlap lexically with the query but are irrelevant.
• We propose a pooling attention mechanism called SMAP.
• SMAP is combined with a pre-trained language model to identify distracting passages.
• Compared to the prior best approach based on the same pre-trained language model, we achieve approximately a 5% absolute improvement on the WikiQA dataset.
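The abstract names SMAP only as a pooling attention mechanism layered on a pre-trained encoder, without giving its formulation. As a rough, hypothetical illustration of what attention pooling over a BERT-style encoder's token embeddings can look like (the class, names, and scoring scheme below are our assumptions, not the authors' actual SMAP design):

```python
import torch
import torch.nn as nn


class AttentionPooling(nn.Module):
    """Illustrative attention pooling over token embeddings.

    NOTE: SMAP's internals are not specified in this abstract; this is a
    generic attention-pooling sketch, not the authors' architecture.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        # Learned scoring vector assigning an importance weight per token.
        self.score = nn.Linear(hidden_size, 1)

    def forward(self, hidden_states: torch.Tensor,
                attention_mask: torch.Tensor) -> torch.Tensor:
        # hidden_states: (batch, seq_len, hidden) encoder outputs
        # attention_mask: (batch, seq_len), 1 for real tokens, 0 for padding
        scores = self.score(hidden_states).squeeze(-1)          # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, -1e9)  # mask padding
        weights = torch.softmax(scores, dim=-1)                 # (batch, seq_len)
        # Weighted sum of token embeddings -> one vector per passage.
        return torch.einsum("bs,bsh->bh", weights, hidden_states)


# Usage: pool dummy BERT-sized outputs into a single passage vector.
pooler = AttentionPooling(hidden_size=768)
hidden = torch.randn(2, 128, 768)            # stand-in encoder outputs
mask = torch.ones(2, 128, dtype=torch.long)  # no padding in this toy batch
passage_vec = pooler(hidden, mask)           # shape: (2, 768)
```

A pooled passage vector of this kind could then be scored against a query representation; per the highlights, the paper's contribution is that its SMAP pooling helps the model identify distracting passages that overlap with the query but are irrelevant.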