Branched Multi-Task Networks: Deciding What Layers To Share

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Blind Submission
Keywords: Multi-Task Learning, Neural Network Architectures, Deep Learning, Efficient Architectures
TL;DR: A method for the automated construction of branched multi-task networks, with strong experimental evaluation on diverse multi-task datasets.
Abstract: In the context of multi-task learning, neural networks with branched architectures have often been employed to jointly tackle the tasks at hand. Such ramified networks typically start with a number of shared layers, after which different tasks branch out into their own sequence of layers. Understandably, as the number of possible network configurations is combinatorially large, deciding what layers to share and where to branch out becomes cumbersome. Prior works have either relied on ad hoc methods to determine the level of layer sharing, which is suboptimal, or utilized neural architecture search techniques to establish the network design, which is considerably expensive. In this paper, we go beyond these limitations and propose a principled approach to automatically construct branched multi-task networks by leveraging the affinities of the employed tasks. Given a specific budget, i.e., the number of learnable parameters, the proposed approach generates architectures in which shallow layers are task-agnostic, whereas deeper ones gradually grow more task-specific. Extensive experimental analysis across numerous, diverse multi-task datasets shows that, for a given budget, our method consistently yields networks with the highest performance, while for a given performance threshold it requires the fewest learnable parameters.
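To make the branched-architecture idea concrete, the following is a minimal PyTorch sketch of the general network family the abstract describes: a shared trunk of task-agnostic shallow layers feeding one task-specific branch per task. All layer sizes, the `BranchedMultiTaskNet` name, and the two-layer trunk/branch split are hypothetical illustration choices; the paper's actual architectures are generated automatically from task affinities under a parameter budget, which this sketch does not implement.

```python
import torch
import torch.nn as nn

class BranchedMultiTaskNet(nn.Module):
    """Illustrative branched multi-task network: shallow layers are shared
    across tasks, deeper layers branch out per task. Hypothetical sizes;
    not the paper's affinity-derived design."""

    def __init__(self, num_tasks: int, width: int = 64, hidden: int = 128):
        super().__init__()
        # Task-agnostic shared trunk (the shallow layers).
        self.trunk = nn.Sequential(
            nn.Conv2d(3, width, 3, padding=1), nn.ReLU(),
            nn.Conv2d(width, hidden, 3, padding=1), nn.ReLU(),
        )
        # Task-specific branches (the deeper layers), one per task.
        self.branches = nn.ModuleList(
            nn.Sequential(
                nn.Conv2d(hidden, hidden, 3, padding=1), nn.ReLU(),
                nn.Conv2d(hidden, 1, 1),  # per-task prediction head
            )
            for _ in range(num_tasks)
        )

    def forward(self, x):
        shared = self.trunk(x)  # computed once, reused by every task
        return [branch(shared) for branch in self.branches]

# Usage: three tasks share the trunk; each gets its own output map.
net = BranchedMultiTaskNet(num_tasks=3)
outputs = net(torch.randn(2, 3, 32, 32))
print([o.shape for o in outputs])  # three tensors of shape (2, 1, 32, 32)
```

Moving the branch point earlier or later in such a network trades parameter count against task-specific capacity, which is exactly the design decision the paper automates.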
Data: [CelebA](https://paperswithcode.com/dataset/celeba), [Cityscapes](https://paperswithcode.com/dataset/cityscapes), [Taskonomy](https://paperswithcode.com/dataset/taskonomy)