Abstract: This study introduces Robust Multi-Task Gradient Boosting (R-MTGB), a framework designed to handle heterogeneous and outlier tasks in multi-task learning. It combines shared representation learning, outlier-aware task partitioning, and task-specific fine-tuning. Results on synthetic and real-world datasets show improved robustness and predictive performance compared to conventional multi-task methods.