Cross-Scenario Knowledge Transfer via Selective Expert Distillation for Multi-Scenario Recommendation
Abstract: Transferring knowledge across different recommendation scenarios remains challenging due to domain shift and negative transfer. We introduce SEDRec, a selective expert distillation framework that learns a specialized expert for each scenario while enabling adaptive knowledge transfer among them. SEDRec employs a novel transfer gate, guided by meta-learning signals, to control when and how inter-scenario knowledge is fused. This selective mechanism avoids negative transfer while improving performance in data-scarce scenarios. Comprehensive experiments on three multi-scenario benchmarks show that SEDRec significantly boosts recommendation performance, particularly in low-resource and emerging scenarios.
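The abstract describes per-scenario experts combined through a learned transfer gate. The paper's actual architecture is not given here, so the following is only an illustrative sketch of the general idea under assumed design choices: each scenario owns a linear expert, cross-scenario knowledge is aggregated by similarity-weighted attention over the other experts, and a sigmoid gate (a stand-in for the meta-learned transfer gate) decides how much of that transferred signal to fuse with the target scenario's own output. All class and parameter names (`SelectiveExpertFusion`, `gate_w`, etc.) are hypothetical.

```python
import numpy as np


def softmax(x):
    """Numerically stable softmax over a 1-D array."""
    e = np.exp(x - x.max())
    return e / e.sum()


class SelectiveExpertFusion:
    """Hypothetical sketch of selective cross-scenario expert fusion:
    one linear expert per scenario plus a scalar gate per scenario
    that controls how much cross-scenario knowledge is mixed in."""

    def __init__(self, n_scenarios, dim, seed=0):
        rng = np.random.default_rng(seed)
        # One (dim x dim) linear expert per scenario (assumed form).
        self.experts = rng.normal(size=(n_scenarios, dim, dim))
        # Per-scenario gate parameters; stand-in for a meta-learned gate.
        self.gate_w = rng.normal(size=(n_scenarios, dim))

    def forward(self, x, target):
        # Each expert's representation of the input.
        outs = np.stack([E @ x for E in self.experts])
        own = outs[target]
        # Similarity-weighted attention over the *other* scenarios' experts.
        others = [s for s in range(len(self.experts)) if s != target]
        scores = softmax(np.array([outs[s] @ own for s in others]))
        transferred = sum(w * outs[s] for w, s in zip(scores, others))
        # Gate in (0, 1): how much transferred knowledge to fuse.
        g = 1.0 / (1.0 + np.exp(-(self.gate_w[target] @ x)))
        return (1.0 - g) * own + g * transferred
```

A gate value near zero leaves the target scenario's expert untouched, which is how a selective mechanism of this kind can avoid negative transfer when the other scenarios are unhelpful.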