Fast Neural Network Adaptation via Parameter Remapping


Sep 25, 2019 ICLR 2020 Conference Blind Submission
  • Abstract: Deep neural networks achieve remarkable performance in many computer vision tasks. However, for most semantic segmentation (seg) and object detection (det) tasks, the backbone of the network directly reuses a network manually designed for classification. Using a network pre-trained on ImageNet as the backbone has been a popular practice in seg/det challenges. However, because of the gap between tasks, adapting the network directly to the target task can further improve performance. Some recent neural architecture search (NAS) methods search for the backbone of seg/det networks, but ImageNet pre-training of the search space representation or of the searched network incurs huge computational cost. In this paper, we propose a fast neural network adaptation method, FNA, which efficiently adapts a network manually designed on ImageNet to new seg/det tasks. We adopt differentiable NAS to adapt the architecture of the network. We first expand the manually designed network into a super network that represents the search space, then successively conduct adaptation at the architecture level and the parameter level. Our parameter-remapping paradigm accelerates the adaptation process. Our experiments cover both seg and det tasks, with adaptation conducted on the MobileNetV2 network. FNA yields performance improvements over both manually designed and NAS-designed networks. The total computational cost of FNA is far lower than that of many state-of-the-art seg/det NAS methods: 1737x less than DPC, 6.8x less than Auto-DeepLab, and 7.4x less than DetNAS.
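The abstract does not spell out how parameters are remapped from the seed network to the super network, but the general idea can be illustrated with a minimal, hypothetical sketch: copy a smaller convolution kernel into the center of a larger zero-initialized kernel (kernel-size remapping), and let extra layers in a deeper stage reuse the last existing layer's weights (depth remapping). The function names and the exact scheme below are illustrative assumptions, not the paper's verbatim method.

```python
import numpy as np

def remap_kernel(w_small, k_large):
    """Hypothetical kernel-size remapping: place a k_small x k_small conv
    weight at the center of a zero-initialized k_large x k_large weight.
    Weight layout assumed: (out_channels, in_channels, kH, kW)."""
    out_c, in_c, k_small, _ = w_small.shape
    assert k_large >= k_small and (k_large - k_small) % 2 == 0
    w_large = np.zeros((out_c, in_c, k_large, k_large), dtype=w_small.dtype)
    off = (k_large - k_small) // 2
    w_large[:, :, off:off + k_small, off:off + k_small] = w_small
    return w_large

def remap_depth(layer_weights, new_depth):
    """Hypothetical depth remapping: keep existing layers; any extra
    layers in the deeper super network copy the last layer's weights."""
    extra = max(new_depth - len(layer_weights), 0)
    return list(layer_weights) + [layer_weights[-1].copy() for _ in range(extra)]

# Usage: remap a 3x3 kernel into a 5x5 slot and a 2-layer stage into 4 layers.
w3 = np.random.randn(8, 4, 3, 3)
w5 = remap_kernel(w3, 5)
stage = remap_depth([np.ones((8, 4, 3, 3)), np.full((8, 4, 3, 3), 2.0)], 4)
```

Either direction lets the super network start from ImageNet-trained weights instead of training from scratch, which is where the claimed cost savings come from.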