Learnable Embedding Space for Efficient Neural Architecture Compression

27 Sept 2018, 22:39 (edited 10 Feb 2022) · ICLR 2019 Conference Blind Submission
  • Keywords: Network Compression, Neural Architecture Search, Bayesian Optimization, Architecture Embedding
  • TL;DR: We propose a method to incrementally learn an embedding space over the domain of network architectures, to enable the careful selection of architectures for evaluation during compressed architecture search.
  • Abstract: We propose a method to incrementally learn an embedding space over the domain of network architectures, to enable the careful selection of architectures for evaluation during compressed architecture search. Given a teacher network, we search for a compressed network architecture by using Bayesian Optimization (BO) with a kernel function defined over our proposed embedding space to select architectures for evaluation. We demonstrate that our search algorithm can significantly outperform various baseline methods, such as random search and reinforcement learning (Ashok et al., 2018). The compressed architectures found by our method are also better than the state-of-the-art manually-designed compact architecture ShuffleNet (Zhang et al., 2018). We also demonstrate that the learned embedding space can be transferred to new settings for architecture search, such as a larger teacher network or a teacher network in a different architecture family, without any training.
  • Code: [Friedrich1006/ESNAC](https://github.com/Friedrich1006/ESNAC) + [1 community implementation](https://paperswithcode.com/paper/?openreview=S1xLN3C9YX)
  • Data: [CIFAR-10](https://paperswithcode.com/dataset/cifar-10), [CIFAR-100](https://paperswithcode.com/dataset/cifar-100)
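The core idea in the abstract — Bayesian Optimization with a kernel function defined over an embedding space of architectures — can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's method: it assumes architectures have already been mapped to fixed embedding vectors (the paper learns this embedding incrementally), uses a standard RBF kernel and a GP-UCB acquisition, and all function names are hypothetical.

```python
import numpy as np

def rbf_kernel(A, B, length_scale=1.0):
    # k(x, x') = exp(-||x - x'||^2 / (2 l^2)), computed over architecture embeddings
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * length_scale ** 2))

def gp_posterior(X_train, y_train, X_cand, noise=1e-6):
    # Standard GP regression posterior over candidate embeddings
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_cand)
    K_inv = np.linalg.inv(K)
    mu = K_s.T @ K_inv @ y_train
    # k(x, x) = 1 for the RBF kernel, so prior variance is 1
    var = 1.0 - np.einsum('ij,ji->i', K_s.T @ K_inv, K_s)
    return mu, np.maximum(var, 0.0)

def select_next(X_train, y_train, X_cand, beta=2.0):
    # UCB acquisition: pick the candidate architecture whose embedding has
    # the best optimistic score (mean reward + beta * uncertainty)
    mu, var = gp_posterior(X_train, y_train, X_cand)
    return int(np.argmax(mu + beta * np.sqrt(var)))
```

With a small `beta`, the selection exploits candidates whose embeddings lie near previously well-scoring architectures; a large `beta` favors candidates far from anything evaluated so far, which is the exploration behavior BO relies on to search the compressed-architecture space efficiently.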