Uniform Graph Pre-training and Prompting for Transferable Recommendation

Published: 01 Jan 2025, Last Modified: 06 Oct 2025 · ACM Trans. Inf. Syst. 2025 · CC BY-SA 4.0
Abstract: Recently, the pre-training and fine-tuning paradigm has achieved impressive performance owing to its ability to transfer general knowledge from a pre-training domain to a target domain. Meanwhile, graph neural networks (GNNs) have gained prominence in recommender systems. However, graph-based recommendation still lacks a unified pre-training and fine-tuning paradigm. Applying pre-training and fine-tuning to graph-based recommendation is challenging due to the unique characteristics of recommendation data, including non-uniform feature representations, negative transfer effects, and skewed data distributions. To overcome these challenges, we introduce Pre-training and Prompting Recommendation (ProRec), a novel model that synergizes uniform graph pre-training with prompt-tuning for recommender systems. Specifically, to address the inconsistency of features across recommendation datasets, ProRec constructs unified input features at the subgraph level and uses a graph auto-encoder for pre-training, laying the foundation for uniform knowledge transfer from the pre-training domain to the downstream domain. Additionally, ProRec employs prompt-tuning during the fine-tuning phase, which adapts the pre-trained knowledge to downstream tasks in a parameter-efficient manner, thereby reducing negative transfer effects. Furthermore, a cross-layer contrastive learning strategy is adopted to counteract skewed data distributions, promoting more evenly distributed and informative representations. Finally, extensive benchmark comparisons demonstrate that ProRec outperforms the latest state-of-the-art methods. The source code necessary for replication is available at https://github.com/Code2Q/ProRec.
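For intuition, a cross-layer contrastive objective of this kind is often realized as an InfoNCE loss between node embeddings taken from two different GNN layers. The following is a minimal, hypothetical PyTorch sketch, not ProRec's actual formulation; the function name, temperature value, and choice of layers are assumptions for illustration only:

    import torch
    import torch.nn.functional as F

    def cross_layer_contrastive_loss(z_low, z_high, temperature=0.2):
        # Hypothetical InfoNCE-style loss: pull each node's embeddings
        # from two GNN layers together, push apart embeddings of
        # different nodes, encouraging more uniform representations.
        # z_low, z_high: (num_nodes, dim) embeddings from two layers.
        z_low = F.normalize(z_low, dim=1)
        z_high = F.normalize(z_high, dim=1)
        # Pairwise cosine similarities between the two layers' embeddings.
        logits = z_low @ z_high.t() / temperature  # (N, N)
        # The positive for node i is its own embedding at the other layer.
        targets = torch.arange(z_low.size(0), device=z_low.device)
        return F.cross_entropy(logits, targets)

    # Example: contrast layer-0 and layer-2 outputs of a 3-layer GNN.
    z0 = torch.randn(1024, 64)
    z2 = torch.randn(1024, 64)
    loss = cross_layer_contrastive_loss(z0, z2)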