Keywords: multi-task learning, multi-domain learning, low-rank tensors, multipath networks
TL;DR: Learn multiple functional paths for different tasks and domains using a shared network with low-rank tensor factors
Abstract: Multi-task and multi-domain learning methods seek to learn multiple tasks/domains, jointly or sequentially, using a single unified network. The key challenge and opportunity is to exploit shared information across tasks and domains to improve the efficiency of the unified network, where efficiency can be measured in terms of accuracy, storage cost, computation, or sample complexity. In this paper, we propose a factorized tensor network (FTN) that achieves accuracy comparable to independent single-task/domain networks with a small number of additional parameters. FTN uses a frozen backbone network from a source model and incrementally adds task/domain-specific low-rank tensor factors to the shared frozen network. This approach can adapt to a large number of target domains and tasks without catastrophic forgetting, and it requires significantly fewer task-specific parameters than existing methods. In experiments on widely used multi-domain and multi-task datasets, FTN achieves accuracy similar to single-task/domain methods while using only 2–6% additional parameters per task. We also demonstrate the effectiveness of FTN for domain adaptation in image generation.
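To make the adaptation mechanism concrete, here is a minimal PyTorch sketch of the general idea the abstract describes: a frozen shared layer augmented with a trainable low-rank factor per task/domain. The class name, rank value, and initialization are illustrative assumptions, not the authors' implementation; the paper factorizes tensors, while this sketch shows the simpler matrix (linear-layer) case.

```python
import torch
import torch.nn as nn


class LowRankAdapterLinear(nn.Module):
    """Frozen shared linear layer plus a per-task trainable low-rank factor.

    Sketch of the factorized-tensor idea: y = W x + b + (U V) x, where W and b
    come from the frozen backbone and only U, V are learned per task/domain.
    """

    def __init__(self, base: nn.Linear, rank: int = 4):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad_(False)  # the shared backbone stays frozen

        # Task-specific low-rank factors: (out_features x rank) and (rank x in_features).
        # Zero-initialized U makes the adapted layer start identical to the backbone.
        self.U = nn.Parameter(torch.zeros(base.out_features, rank))
        self.V = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Frozen backbone response plus the learned low-rank correction.
        return self.base(x) + x @ (self.U @ self.V).t()
```

Storing one (U, V) pair per task adds only rank x (in_features + out_features) parameters per adapted layer, which is how a method of this kind can keep the per-task overhead to a few percent of the backbone while avoiding catastrophic forgetting (the backbone weights are never updated).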
Supplementary Material: pdf
Submission Number: 5470