Highlights
• We propose a hybrid ranking architecture for passage ranking that effectively addresses the problem that ranking models are easily misled by overlapping but irrelevant passages.
• We propose a pooling attention mechanism called SMAP.
• SMAP is combined with a pre-trained language model to identify distracting passages.
• An absolute improvement of approximately 5% is achieved on the WikiQA dataset, compared to the prior best approach based on the same pre-trained language model.