DYNASHARE: DYNAMIC NEURAL NETWORKS FOR MULTI-TASK LEARNING

29 Sept 2021 (modified: 13 Feb 2023) · ICLR 2022 Conference Withdrawn Submission
Keywords: multi-task learning, dynamic networks, adaptive inference, neural network
Abstract: Parameter sharing approaches for deep multi-task learning share a common intuition: for a single network to perform multiple prediction tasks, it must support multiple specialized execution paths. However, previous parameter sharing approaches have relied on a static network structure for each task. In this paper, we propose to increase the capacity of a single network to support multiple tasks by radically expanding the space of possible specialized execution paths. DynaShare is a new approach to deep multi-task learning that learns a hierarchical gating policy from the training data: a task-specific policy performs coarse layer selection, and gating units make fine-grained decisions for individual input instances; together, these determine the execution path at inference time. Experimental results on standard multi-task learning benchmarks demonstrate the potential of the proposed approach.
One-sentence Summary: DynaShare is a new approach to deep multi-task learning that learns from the training data a hierarchical gating policy consisting of a task-specific policy for coarse layer selection and gating units for individual input instances.
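
The abstract does not include an implementation, but the hierarchical gating idea can be sketched concretely. The following PyTorch snippet is a minimal, hypothetical illustration, not the authors' method: all names (DynaShareSketch, task_policy, instance_gates) and the exact gating mechanism are assumptions. A learned per-task parameter gates each layer coarsely, a small per-layer unit gates each input instance, and their product decides how much of the layer's output to use, with a residual identity path standing in for skipped layers.

```python
import torch
import torch.nn as nn

class DynaShareSketch(nn.Module):
    """Hypothetical sketch of hierarchical gating for multi-task learning.

    A task-level policy produces one logit per (task, layer) pair for
    coarse layer selection; a per-layer gating unit produces an
    instance-level gate from the layer's input. Their product is the
    hierarchical gate: when it is near zero, the layer reduces to an
    identity (residual) path, so the execution path varies per task
    and per instance.
    """

    def __init__(self, num_tasks: int, num_layers: int, dim: int):
        super().__init__()
        self.layers = nn.ModuleList(
            [nn.Sequential(nn.Linear(dim, dim), nn.ReLU()) for _ in range(num_layers)]
        )
        # Task-specific policy: one logit per (task, layer) pair.
        self.task_policy = nn.Parameter(torch.zeros(num_tasks, num_layers))
        # Instance-level gating units, one per layer.
        self.instance_gates = nn.ModuleList(
            [nn.Linear(dim, 1) for _ in range(num_layers)]
        )

    def forward(self, x: torch.Tensor, task_id: int) -> torch.Tensor:
        for i, layer in enumerate(self.layers):
            task_gate = torch.sigmoid(self.task_policy[task_id, i])  # scalar, per task
            inst_gate = torch.sigmoid(self.instance_gates[i](x))     # (batch, 1), per instance
            gate = task_gate * inst_gate                             # hierarchical gate
            # Residual mixing: gate -> 0 skips the layer (identity path).
            x = gate * layer(x) + (1.0 - gate) * x
        return x

# Usage: route a batch of instances along task 0's execution path.
model = DynaShareSketch(num_tasks=3, num_layers=4, dim=16)
out = model(torch.randn(8, 16), task_id=0)
print(out.shape)  # torch.Size([8, 16])
```

In a real training setup one would presumably discretize these gates (e.g., with Gumbel-Softmax or a straight-through estimator) so that inference executes a hard path; the soft sigmoid gates above keep the sketch short and differentiable.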