Chopout: A Simple Way to Train Variable Sized Neural Networks at Once

Tatsuya Shirakawa

Oct 16, 2018 NIPS 2018 Workshop CDNNRIA Blind Submission readers: everyone
  • Abstract: Large deep neural networks require huge memory to run, and their running speed is sometimes too slow for real applications. Therefore, reducing network size while keeping accuracy is crucial for practical applications. We present a novel neural network operator, chopout, with which neural networks are trained, even in a single training process, so that truncated sub-networks perform as well as possible. Chopout is easy to implement and to integrate into most types of existing neural networks. Furthermore, it enables the size of networks and latent representations to be reduced even after training, simply by truncating layers. We show its effectiveness through several experiments.
  • TL;DR: We present a novel, simple operator, chopout, with which neural networks are trained, even in a single training process, so that truncated sub-networks perform as well as possible.
  • Keywords: deep neural networks, channel pruning, representation pruning
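The abstract describes chopout as a dropout-like operator that, during training, randomly truncates a layer's output so that every prefix sub-network learns to perform well on its own. A minimal sketch of that idea in NumPy (the semantics are inferred from the abstract, not taken from the authors' reference implementation; the function name and signature are assumptions):

```python
import numpy as np

def chopout(x, rng, train=True):
    """Sketch of a chopout forward pass (assumed semantics).

    During training, sample a kept width k uniformly from [1, n] and zero
    out every feature beyond the first k, so each prefix of the layer is
    trained to work as a standalone sub-network. At inference, either pass
    the input through unchanged or truncate to a chosen width.
    """
    n = x.shape[-1]
    if not train:
        return x
    k = rng.integers(1, n + 1)  # random kept width in [1, n]
    out = x.copy()
    out[..., k:] = 0.0          # drop the suffix of the feature vector
    return out
```

After training, a smaller network would then be obtained by keeping only the first k units of each chopout layer (i.e., slicing the corresponding rows/columns of adjacent weight matrices), which matches the abstract's claim that size can be reduced post hoc just by truncating layers.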