ESSR: Evolving Sparse Sharing Representation for Multitask Learning

Published: 01 Jan 2024 · Last Modified: 17 Jan 2025 · IEEE Trans. Evol. Comput. 2024 · CC BY-SA 4.0
Abstract: Multitask learning (MTL) uses knowledge transfer among tasks to improve the generalization performance of all tasks. In deep MTL, knowledge transfer is often implemented by sharing all hidden features across tasks. A major shortcoming is that this can lead to negative knowledge transfer when task correlation is weak. To overcome this, this article proposes an evolutionary method that learns sparse sharing representations adaptively. By embedding neural network optimization into evolutionary multitasking, the proposed method finds an optimal combination of tasks and shared features. It identifies negatively correlated and redundant features and removes them from the hidden feature set, so that an optimal sparse sharing subnetwork can be produced for each task. Experimental results show that the proposed method achieves better learning performance with a smaller inference model than other related methods.
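The abstract includes no code, but its core idea, evolving a per-task binary mask over a shared hidden feature set so each task keeps only the features that help it, can be sketched. The following is a minimal illustration, not the paper's ESSR algorithm: it assumes a single fixed shared layer, a toy least-squares fitness, and a simple (mu + lambda) evolutionary loop, with all names, sizes, and the fitness function chosen purely for demonstration.

```python
# Minimal sketch of per-task sparse sharing via evolved binary masks.
# Hypothetical setup: one fixed random shared layer, a toy regression
# fitness, and independent (mu + lambda) evolution per task. The actual
# ESSR method embeds network training into evolutionary multitasking;
# this only illustrates the mask-selection idea.
import numpy as np

rng = np.random.default_rng(0)

N_HIDDEN = 32          # size of the shared hidden feature set
N_TASKS = 2            # number of learning tasks
POP_SIZE = 20          # candidate masks kept per task
N_GENERATIONS = 50

# Shared representation: one random projection used by every task.
W_shared = rng.standard_normal((10, N_HIDDEN))

def hidden(x):
    """Shared hidden features for an input batch x of shape (n, 10)."""
    return np.tanh(x @ W_shared)

def fitness(mask, x, y):
    """Toy fitness: negative MSE of a least-squares head fit on the
    masked hidden features. Higher is better."""
    h = hidden(x) * mask            # zero out pruned features
    w, *_ = np.linalg.lstsq(h, y, rcond=None)
    return -np.mean((h @ w - y) ** 2)

def mutate(mask, p=0.05):
    """Flip each mask bit with probability p."""
    flips = rng.random(mask.shape) < p
    return np.where(flips, 1 - mask, mask)

# Toy data: each task is a different random target function.
x = rng.standard_normal((200, 10))
ys = [np.sin(x @ rng.standard_normal(10)) for _ in range(N_TASKS)]

# One population of binary masks per task.
pops = [rng.integers(0, 2, size=(POP_SIZE, N_HIDDEN)) for _ in range(N_TASKS)]

for gen in range(N_GENERATIONS):
    for t in range(N_TASKS):
        parents = pops[t]
        children = np.array([mutate(m) for m in parents])
        combined = np.vstack([parents, children])
        scores = np.array([fitness(m, x, ys[t]) for m in combined])
        # (mu + lambda) selection: keep the best POP_SIZE masks.
        pops[t] = combined[np.argsort(scores)[-POP_SIZE:]]

for t in range(N_TASKS):
    best = pops[t][-1]   # population is sorted ascending by fitness
    print(f"task {t}: kept {int(best.sum())}/{N_HIDDEN} shared features")
```

Each task ends up with its own sparse subnetwork (the surviving mask), so a feature that hurts one task can be pruned for that task while remaining available to the others, which is the mechanism the abstract credits with avoiding negative transfer.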