TransformerRank: Enhancing Listwise Ranking with Advanced Attention Mechanisms

Anonymous

16 Dec 2023 · ACL ARR 2023 December Blind Submission
Abstract: In recommendation systems and search engine optimization, modeling listwise dependencies among items has emerged as a central challenge. Traditional ranking methods, which are predominantly pointwise or pairwise, capture only limited interactions within an item list. We present TransformerRank, a novel approach tailored to the complexities of listwise ranking: it applies a custom transformer model within a sliding-window scheme, extending beyond the capabilities of conventional ranking algorithms. Extensive experiments on diverse datasets, including TripClick, Yahoo!, and ORCAS, show that TransformerRank consistently outperforms established methods on key metrics such as NDCG@10 and MAP. We also conduct an ablation study on the trade-off between accuracy and computational efficiency, underscoring the practicality of our approach. TransformerRank represents a significant advance in listwise ranking: it improves both the accuracy and efficiency of ranking systems and offers deeper insight into the dynamics of item interdependencies. This research broadens potential applications in data science and natural language processing and sets a benchmark for future work on leveraging listwise dependencies in sequence data.
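The submission does not include code, but the core idea the abstract describes (scoring items with self-attention inside overlapping windows, then aggregating the per-window scores into one listwise ranking) can be sketched minimally as follows. This is an illustrative NumPy sketch, not the authors' implementation: the single attention head, the random projection matrices, and the mean aggregation over overlapping windows are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def window_scores(window, w_q, w_k, w_v, v_out):
    # Single-head self-attention over the items in one window,
    # followed by a linear readout to one relevance score per item.
    # window: (window_size, d); returns (window_size,).
    q, k, v = window @ w_q, window @ w_k, window @ w_v
    attn = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return (attn @ v) @ v_out

def sliding_window_rank(items, window_size, stride, w_q, w_k, w_v, v_out):
    # Slide a window over the item list, score each window with
    # attention, average each item's scores over all windows that
    # contain it, and return item indices sorted by relevance.
    n = len(items)
    totals, counts = np.zeros(n), np.zeros(n)
    for start in range(0, max(n - window_size, 0) + 1, stride):
        idx = np.arange(start, min(start + window_size, n))
        totals[idx] += window_scores(items[idx], w_q, w_k, w_v, v_out)
        counts[idx] += 1
    return np.argsort(-(totals / np.maximum(counts, 1)))
```

With window_size = 5 and stride = 3 over 20 items, each item is scored in one or two windows; a real model would learn the projection weights (here drawn at random) from listwise relevance labels.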
Paper Type: long
Research Area: Information Retrieval and Text Mining
Contribution Types: Model analysis & interpretability, NLP engineering experiment
Languages Studied: English