What Role Does Hyperbolic Space Play in Graph Learning?

Anonymous

02 Mar 2022 (modified: 03 Oct 2023) · Submitted to GTRL 2022 · Readers: Everyone
Keywords: geometric learning, recommender systems, hyperbolic spaces
TL;DR: We take collaborative filtering, a core recommendation technique, as the medium to investigate the behaviors of hyperbolic and Euclidean graph models.
Abstract: Models built upon hyperbolic spaces, such as graph neural networks, have achieved great success on tree-structured data, but it remains unclear in which aspects hyperbolic space actually plays an effective role. To address this issue, we take the recommender system, a user-item graph-structured network, as an example for an in-depth analysis. Given the prevalence of power-law distributions in user-item networks, hyperbolic space has recently attracted considerable attention and achieved impressive performance. The advantage of hyperbolic recommendation lies in its exponentially increasing capacity, which is well suited to describing power-law distributed user-item networks, a property its Euclidean counterpart lacks. Nonetheless, it remains unclear which aspects of hyperbolic models are effective and which are counterproductive. To address these concerns, we take collaborative filtering as the medium to investigate the behaviors of hyperbolic and Euclidean graph models. The results reveal that tail nodes receive more emphasis in the hyperbolic model than in its Euclidean counterpart, though there is still ample room for improvement; head nodes receive only modest attention in hyperbolic space, which could be considerably improved; and, nonetheless, the hyperbolic models show more competitive overall performance.
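The "exponentially increasing capacity" claim can be made concrete with the Poincaré ball model of hyperbolic space: as a point approaches the boundary of the unit ball, its hyperbolic distance from the origin grows without bound, so a bounded region can host exponentially many well-separated embeddings. The following is a minimal illustrative sketch, not code from the paper; the function name and the specific radii are our own choices for illustration.

```python
import math

def poincare_distance(u, v):
    # Geodesic distance in the Poincaré ball model:
    # d(u, v) = arcosh(1 + 2*||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    sq_norm = lambda x: sum(c * c for c in x)
    diff = [a - b for a, b in zip(u, v)]
    numerator = 2.0 * sq_norm(diff)
    denominator = (1.0 - sq_norm(u)) * (1.0 - sq_norm(v))
    return math.acosh(1.0 + numerator / denominator)

# Moving a point toward the boundary of the unit ball: the Euclidean
# distance from the origin stays below 1, while the hyperbolic distance
# diverges, which is the extra "room" used to embed power-law graphs.
origin = (0.0, 0.0)
for r in (0.5, 0.9, 0.99):
    d_hyp = poincare_distance(origin, (r, 0.0))
    print(f"r={r}: Euclidean distance {r:.2f}, hyperbolic distance {d_hyp:.2f}")
```

For a point at Euclidean radius r, this reduces to the closed form 2·artanh(r), so the hyperbolic distance blows up logarithmically in 1/(1 − r) as r → 1 while the Euclidean distance saturates at 1.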