A Simple Approach to Balance Task Loss in Multi-Task Learning

Published: 01 Jan 2021, Last Modified: 12 May 2023 · IEEE BigData 2021
Abstract: In multi-task learning, the training losses of different tasks vary in scale. Many existing works address this imbalance, and we classify them into five categories. In this paper, we propose a Balanced Multi-Task Learning (BMTL) framework. Unlike existing studies that rely on task weighting, the BMTL framework transforms the training loss of each task to balance the tasks, based on the intuition that tasks with larger training losses should receive more attention during optimization. We analyze the transformation function, deriving necessary conditions and several properties. The proposed BMTL framework is simple and can be combined with most multi-task learning models. Empirical studies show that the proposed BMTL framework achieves state-of-the-art performance.
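The core idea (transforming each task's loss with a monotonically increasing function so that high-loss tasks implicitly receive larger gradients) can be sketched as follows. This is a minimal illustration, not the paper's exact method: the exponential transform, the function name, and the `temperature` parameter are assumptions chosen to satisfy the stated intuition.

```python
import math

def balanced_multitask_loss(task_losses, temperature=1.0):
    """Combine per-task losses after a monotone, convex transformation.

    Because d/dl exp(l/T) = exp(l/T)/T grows with l, a task with a larger
    training loss contributes a larger gradient, so the optimizer pays it
    more attention without any explicit per-task weights.
    """
    # exp(l / T) is one transformation consistent with the paper's
    # intuition; the transformed losses are simply summed.
    return sum(math.exp(l / temperature) for l in task_losses)
```

For example, `balanced_multitask_loss([2.0, 1.0])` lets the first task dominate the combined objective more strongly than a plain sum `2.0 + 1.0` would, which is the balancing effect the abstract describes.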
