Long-tail Inductive Feature Transfer for Knowledge-aware Multi-Behavior Recommendation

Cheng Li, Yong Xu, Suhua Tang, Weiguo Wang, Xin He, Jinde Cao

Published: 01 Jan 2025, Last Modified: 28 Nov 2025 · IEEE Transactions on Big Data · CC BY-SA 4.0
Abstract: Multi-behavior recommendation captures fine-grained user intents by jointly modeling multiple interaction behaviors, significantly improving recommendation accuracy and diversity. However, existing methods often overlook the complex semantic relations between items and entities. Moreover, constructing separate subgraphs for different interaction types frequently yields sparse graph structures, which exacerbates the long-tail problem. To address these challenges, we propose Long-tail Inductive Feature Transfer (LIFT), a method for knowledge-aware multi-behavior recommendation. To mitigate the uneven node distribution in graph structures, we introduce a knowledge transfer-based feature reconstruction mechanism. Specifically, we drop a portion of the neighbors of head nodes to construct tail-like proxy representations, and train a reconstructor on these proxies to recover the original head-node features. The trained reconstructor is then used to backfill missing neighbor information for tail nodes, thereby achieving a more balanced feature distribution across nodes. Furthermore, we integrate multi-behavior and semantic contrastive learning to jointly optimize the representations. Extensive experiments on four datasets demonstrate that LIFT outperforms state-of-the-art methods, and further analysis validates the uniformity of its learned representations. The source code is available at: https://github.com/city59/LIFT.
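To make the feature-reconstruction mechanism concrete, the sketch below illustrates one plausible reading of the abstract: head nodes have some neighbors randomly dropped to form tail-like proxies, an MLP reconstructor (here called `FeatureReconstructor`, an assumed name) is trained to recover the full head-node representation from the proxy, and the trained reconstructor is then applied to tail-node representations to backfill missing neighborhood information. All module names, the mean aggregator, and the MSE objective are illustrative assumptions, not the paper's actual architecture.

```python
# Minimal PyTorch sketch of head->tail feature reconstruction as described in the
# abstract. All names, dimensions, and loss choices are assumptions for illustration.
import torch
import torch.nn as nn


class FeatureReconstructor(nn.Module):
    """Maps a degraded (tail-like) node representation back to a full one."""

    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, dim)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def aggregate(node_emb, neighbor_emb, keep_mask):
    """Mean-aggregate the kept neighbors with the node's own embedding."""
    # neighbor_emb: (N, K, D); keep_mask: (N, K), 1 = neighbor kept, 0 = dropped
    kept = neighbor_emb * keep_mask.unsqueeze(-1)
    denom = keep_mask.sum(dim=1, keepdim=True).clamp(min=1.0)
    return 0.5 * node_emb + 0.5 * kept.sum(dim=1) / denom


def train_step(reconstructor, optimizer, head_emb, head_neighbors, drop_rate=0.5):
    """One step: build tail-like proxies of head nodes, then train the
    reconstructor to recover the full head-node representations."""
    full_mask = torch.ones(head_neighbors.shape[:2])
    target = aggregate(head_emb, head_neighbors, full_mask)        # full head features
    proxy_mask = (torch.rand_like(full_mask) > drop_rate).float()  # drop some neighbors
    proxy = aggregate(head_emb, head_neighbors, proxy_mask)        # tail-like proxy
    loss = nn.functional.mse_loss(reconstructor(proxy), target.detach())
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    N, K, D = 64, 10, 32  # head nodes, neighbors per node, embedding dim (assumed)
    recon = FeatureReconstructor(D)
    opt = torch.optim.Adam(recon.parameters(), lr=1e-3)
    head_emb = torch.randn(N, D)
    head_neighbors = torch.randn(N, K, D)
    for _ in range(5):
        train_step(recon, opt, head_emb, head_neighbors)

    # At inference time, tail-node representations (built from their few real
    # neighbors) are passed through the trained reconstructor to backfill the
    # missing neighborhood information.
    tail_repr = aggregate(torch.randn(16, D), torch.randn(16, K, D),
                          (torch.rand(16, K) > 0.8).float())
    backfilled = recon(tail_repr)
```

The multi-behavior and semantic contrastive objectives mentioned in the abstract would be added on top of these reconstructed representations; they are omitted here since the abstract does not specify their form.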