Abstract: As multimedia platforms such as TikTok and YouTube become increasingly prevalent, there is a growing demand for effective recommendation techniques. However, current recommendation methods often rely on categorical identity features that cannot be shared across platforms, making it difficult to fine-tune models for new scenarios. Content displayed on these platforms often contains multimedia information, giving rise to a mixture-of-modality (MoM) feedback scenario. In addition, building an effective recommender system on platforms with smaller data footprints is challenging. To address these challenges, we propose TransRec, a general-purpose model pre-trained on a large-scale recommendation dataset to learn directly from MoM feedback in an end-to-end manner. TransRec enables transfer learning across diverse scenarios without relying on shared users or items, and can transfer knowledge across modalities, thereby expanding the range of recommendation tasks it can accomplish. We empirically study TransRec's transfer ability in four real-world recommendation settings from distinct platforms, examining its effects as the source and target data sizes scale. Our results show that learning neural recommendation models from MoM feedback is a promising path toward general-purpose recommender systems. Additionally, we release an MoM dataset (https://github.com/jieWANGforwork/TransRec) for research.