Generalizing Deep Multi-task Learning with Heterogeneous Structured Networks

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Withdrawn Submission
TL;DR: a distributed latent-space based knowledge-sharing framework for deep multi-task learning
Abstract: Many real applications call for learning multiple tasks from different data sources/modalities with unbalanced sample sizes and dimensions. Unfortunately, existing cutting-edge deep multi-task learning (MTL) approaches cannot be directly applied in these settings, due to either heterogeneous input dimensions or heterogeneity in the optimal network architectures of different tasks. A knowledge-sharing mechanism that handles the intrinsic discrepancies among network architectures across tasks is therefore needed. To this end, we propose a flexible knowledge-sharing framework for jointly learning multiple tasks from distinct data sources/modalities. The proposed framework allows each task to own its task (data)-specific network design via a compact tensor representation, while sharing is achieved through partially shared latent cores. By providing finer-grained sharing control through the latent cores, our framework is effective at transferring task-invariant knowledge while remaining efficient at learning task-specific features. Experiments in both single and multiple data source/modality settings demonstrate promising results, particularly when data are scarce.
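The sharing mechanism the abstract describes lends itself to a compact sketch. Below is a minimal PyTorch illustration, assuming a simple factor-core-factor parameterization of each task's weights; the class name `PartiallySharedLayer` and the additive split into `G_shared`/`G_task` are illustrative choices, not the paper's actual tensor construction.

```python
import torch
import torch.nn as nn

class PartiallySharedLayer(nn.Module):
    """One linear map per task, factorized through a low-rank latent core.

    W_t = U_t @ (G_shared + G_t) @ V_t: the rank x rank core G_shared
    carries task-invariant knowledge, while U_t, V_t, and G_t absorb
    each task's own input/output dimensions and task-specific features.
    """

    def __init__(self, in_dims, out_dims, rank):
        super().__init__()
        self.G_shared = nn.Parameter(0.1 * torch.randn(rank, rank))
        self.U = nn.ParameterList(
            [nn.Parameter(torch.randn(d, rank) / d ** 0.5) for d in in_dims])
        self.V = nn.ParameterList(
            [nn.Parameter(torch.randn(rank, d) / rank ** 0.5) for d in out_dims])
        # Task-specific cores start at zero, so the shared core dominates early.
        self.G_task = nn.ParameterList(
            [nn.Parameter(torch.zeros(rank, rank)) for _ in in_dims])

    def forward(self, x, task):
        core = self.G_shared + self.G_task[task]
        return x @ (self.U[task] @ core @ self.V[task])


# Two tasks with heterogeneous input/output sizes share one latent core.
layer = PartiallySharedLayer(in_dims=[784, 300], out_dims=[128, 64], rank=32)
h0 = layer(torch.randn(8, 784), task=0)  # shape (8, 128)
h1 = layer(torch.randn(8, 300), task=1)  # shape (8, 64)
```

Because only the small latent cores are shared, tasks with mismatched input dimensions, and more generally with different network designs, can still exchange task-invariant structure.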
Keywords: deep multi-task learning, heterogeneous network architectures, tensor representation
