Sequence-Graph Duality: Unifying User Modeling with Self-Attention for Sequential Recommendation

Published: 22 Nov 2022, Last Modified: 05 May 2023. NeurIPS 2022 GLFrontiers Workshop.
Keywords: User modeling, self-attention, graph learning
TL;DR: Combining sequence and graph modeling of users for better sequential recommendation performance using Transformers.
Abstract: User modeling is of great importance in personalization services. Many existing methods treat users as interaction sequences to capture their evolving interests. Another line of research models each user as a user graph in which the user's interactions are modeled as nodes. Nodes (interactions) in user graphs are connected via edges that reflect certain relations, such as item similarity, so graph-based user modeling can flexibly store item relationships. In this work, we introduce a novel user representation, the Heterogeneous User Graph (HUG), which unifies sequence- and graph-based user modeling to take advantage of both methods. A HUG carries two types of edges: sequential edges that preserve the order of interactions and collaborative edges that connect collaboratively similar items (i.e., items interacted with by similar sets of users). To learn latent user representations for recommendation tasks, we propose a multi-head attention-based architecture called the Heterogeneous User Graph Transformer (HUGT). HUGT builds on SASRec and can concurrently capture the sequential patterns and graph topology encoded in HUGs. We conduct experiments on four real-world datasets from three different application domains. Experimental results show that (1) jointly modeling users as sequences and graphs with HUG provides better recommendation performance than sequence-only and graph-only user modeling; (2) HUGT is effective in learning user latent representations from HUGs; (3) HUGT outperforms the baselines by up to 10% on datasets with long sequences and matches state-of-the-art performance on datasets with short sequences.
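The abstract's HUG construction can be sketched in a few lines. The following is an illustrative sketch only (not the authors' code): sequential edges link consecutive interactions in each user's sequence, and collaborative edges link items whose interacting user sets exceed a Jaccard-similarity threshold. The function name, input format, and the default threshold are all assumptions made for this example.

```python
# Illustrative sketch of building a Heterogeneous User Graph (HUG):
# sequential edges preserve interaction order; collaborative edges connect
# items interacted with by similar sets of users. Names and the similarity
# threshold are assumptions, not taken from the paper.
from collections import defaultdict
from itertools import combinations


def build_hug(user_seqs, sim_threshold=0.5):
    """user_seqs: {user_id: [item_id, ...]} in interaction order.
    Returns (sequential_edges, collaborative_edges) as sets of item pairs."""
    # Sequential edges: consecutive interactions within each user's sequence.
    seq_edges = set()
    for items in user_seqs.values():
        for a, b in zip(items, items[1:]):
            seq_edges.add((a, b))

    # Collaborative edges: items whose user sets are similar (Jaccard).
    item_users = defaultdict(set)
    for user, items in user_seqs.items():
        for item in items:
            item_users[item].add(user)
    col_edges = set()
    for i, j in combinations(sorted(item_users), 2):
        ui, uj = item_users[i], item_users[j]
        if len(ui & uj) / len(ui | uj) >= sim_threshold:
            col_edges.add((i, j))
    return seq_edges, col_edges
```

In the paper's framing, a sequence-only model sees just the first edge set, a graph-only model just the second; HUGT attends over both edge types at once.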