Graph-Theoretic Insights into Bayesian Personalized Ranking for Recommendation

Published: 18 Sept 2025, Last Modified: 29 Oct 2025 · NeurIPS 2025 poster · CC BY 4.0
Keywords: Graph self-supervised learning, Bayesian Personalized Ranking, latent hyperbolic geometry, loss function, Network geometry
TL;DR: This paper introduces BPR+, a novel loss function for graph self-supervised learning that extends Bayesian Personalized Ranking (BPR) by incorporating even-hop paths to better capture global connectivity and topological structure.
Abstract: Graph self-supervised learning (GSL) is essential for processing graph-structured data, reducing the need for manual labeling. Traditionally, this paradigm has extensively utilized Bayesian Personalized Ranking (BPR) as its primary loss function. Despite its widespread application, theoretical analysis of how BPR evaluates node relations has remained largely unexplored. This paper employs recent advancements in latent hyperbolic geometry to deepen our understanding of node relationships from a graph-theoretical perspective. We analyze BPR's limitations, particularly its reliance on local connectivity through 2-hop paths, which overlooks global connectivity and the broader topological structure. To address these shortcomings, we propose a novel loss function, BPR+, designed to encompass even-hop paths and better capture global connectivity and topological nuances. This approach facilitates a more detailed measurement of user-item relationships and improves the granularity of relationship assessments. We validate BPR+ through extensive empirical testing across five real-world datasets and demonstrate its efficacy in refining graph self-supervised learning frameworks. Additionally, we explore the application of BPR+ in drug repositioning, highlighting its potential to support pharmaceutical research and development. Our findings not only illuminate the success factors of previous methodologies but also offer new theoretical insights into this learning paradigm.
Primary Area: Deep learning (e.g., architectures, generative models, optimization for deep networks, foundation models, LLMs)
Submission Number: 28100