Concatenated Tensor Networks for Deep Multi-Task Learning

2020 (modified: 26 Jan 2023) · ICONIP (5) 2020
Abstract: Deep Multi-Task Learning has achieved great success in a number of domains. However, the enormous number of parameters in current deep Multi-Task models leads to extremely high storage costs. Several methods based on tensor networks have been proposed to address this problem. However, methods based on the tensor train format share information along only one mode, while the huge central core tensor of the Tucker format is difficult to store and optimize. To tackle these problems, we introduce a novel Concatenated Tensor Network structure, in particular a Projected Entangled Pair States (PEPS)-like structure, into multi-task models. We name the resulting multi-task models Concatenated Tensor Multi-Task Learning (CT-MTL).
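The storage argument in the abstract can be made concrete with a rough parameter count. The sketch below is not from the paper: it assumes an order-d weight tensor with every mode of size n, a uniform bond/rank r, and the d cores of the PEPS-like network arranged on a hypothetical p x q grid (p * q = d). The example values at the bottom are illustrative, not figures reported by the authors.

```python
# Minimal sketch: parameter counts for the tensor formats contrasted in the
# abstract. All dimensions (d, n, r, p, q) are hypothetical choices.

def tucker_params(d, n, r):
    # d factor matrices of size n x r plus a central core of size r^d,
    # which grows exponentially with the tensor order d.
    return d * n * r + r ** d

def tensor_train_params(d, n, r):
    # Two boundary cores of size n x r and (d - 2) inner cores of size
    # r x n x r; storage is linear in d, but each core couples only
    # neighbouring modes along a single chain.
    return 2 * n * r + (d - 2) * n * r ** 2

def peps_grid_params(p, q, n, r):
    # PEPS-like 2D grid: every core keeps its physical index of size n and
    # one bond of size r per grid neighbour (2 at corners, 3 on edges,
    # 4 in the interior), so modes are shared along both grid directions.
    total = 0
    for i in range(p):
        for j in range(q):
            neighbours = (i > 0) + (i < p - 1) + (j > 0) + (j < q - 1)
            total += n * r ** neighbours
    return total

if __name__ == "__main__":
    d, n, r = 6, 8, 4          # order-6 tensor, mode size 8, rank 4 (hypothetical)
    p, q = 2, 3                # the 6 cores arranged on a 2 x 3 grid
    print("dense tensor  :", n ** d)                      # 262144
    print("Tucker format :", tucker_params(d, n, r))      # 4288
    print("tensor train  :", tensor_train_params(d, n, r))# 576
    print("PEPS-like grid:", peps_grid_params(p, q, n, r))# 1536
```

Under these assumed dimensions, the Tucker core dominates the count through the r^d term, while the tensor train and the PEPS-like grid stay polynomial in the order; the grid additionally connects cores along two directions rather than a single chain, which is the property the abstract highlights.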