Abstract: This paper introduces ADG-QPP (Adaptive Disturbance Generation), an unsupervised Query Performance Prediction (QPP) method designed specifically for dense neural retrievers. The foundational idea of ADG-QPP is to estimate a query's performance from its degree of robustness to perturbations. Traditional QPP methods rely on predefined lexical perturbations of the query, which apply only to sparse retrieval methods and fail to maintain consistent performance across datasets. We address these limitations by perturbing the query's neural embedding representation with an injected disturbance derived from focal network-based measurements, including node-based, edge-based, and cluster-based metrics. Rather than applying the same perturbation to all queries, our approach generates an instance-wise disturbance for each query, which is then used to perturb it. Through extensive experiments on three benchmark datasets, we demonstrate that ADG-QPP outperforms state-of-the-art baselines in terms of Kendall's \(\tau\), Spearman's \(\rho\), and Pearson's \(\rho\) correlations.
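The robustness intuition behind embedding-level perturbation can be sketched as follows. This is a minimal illustration only: it substitutes isotropic Gaussian noise and a top-\(k\) overlap measure for the paper's focal network-derived, instance-wise disturbance, and all function names and parameters (`perturb_query`, `robustness_score`, `sigma`, `n_trials`) are hypothetical.

```python
import numpy as np

def perturb_query(q_emb, sigma, rng):
    """Add a disturbance to a dense query embedding.

    Illustrative stand-in: isotropic Gaussian noise. ADG-QPP instead
    derives an instance-wise disturbance from focal-network metrics.
    """
    return q_emb + rng.normal(scale=sigma, size=q_emb.shape)

def robustness_score(q_emb, doc_embs, sigma=0.05, n_trials=10, k=5, seed=0):
    """Estimate query robustness as mean top-k overlap between the
    original and perturbed retrieval lists (dot-product retrieval).

    Higher overlap -> retrieval is stable under perturbation -> the
    query is predicted to perform well.
    """
    rng = np.random.default_rng(seed)
    base_topk = set(np.argsort(-(doc_embs @ q_emb))[:k])
    overlaps = []
    for _ in range(n_trials):
        q_pert = perturb_query(q_emb, sigma, rng)
        pert_topk = set(np.argsort(-(doc_embs @ q_pert))[:k])
        overlaps.append(len(base_topk & pert_topk) / k)
    return float(np.mean(overlaps))
```

A QPP evaluation would then correlate these predicted scores with an actual effectiveness measure (e.g., per-query MRR) across a query set, using Kendall's \(\tau\), Spearman's \(\rho\), or Pearson's \(\rho\).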
External IDs: dblp:journals/ml/SaleminezhadARBB25