Grassmannian initialization: Neural network initialization using subspace packing

May 28, 2019 Blind Submission
  • Keywords: Model initialization, subspace packing, training
  • TL;DR: Initialize weights using off-the-shelf Grassmannian codebooks, get faster training and better accuracy
  • Abstract: We recently observed that convolutional filters initialized farthest apart from each other, using off-the-shelf pre-computed Grassmannian subspace packing codebooks, performed surprisingly well across many datasets. Through this short paper, we disseminate some initial results in the hope of stimulating the deep-learning community's curiosity toward classical Grassmannian subspace packing results as a source of new ideas for more efficient initialization strategies.
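The core idea can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a `codebook` argument holding an `(N, in_channels*k*k)` array of unit-norm vectors taken from an off-the-shelf Grassmannian line packing, and falls back to a random orthonormal stand-in (which is *not* an optimal packing) purely so the sketch runs end to end.

```python
import numpy as np

def grassmannian_init(out_channels, in_channels, k, codebook=None, seed=0):
    """Initialize conv filters from rows of a pre-computed packing codebook.

    `codebook` (hypothetical parameter): an (N, in_channels*k*k) array of
    unit-norm vectors from an off-the-shelf Grassmannian subspace packing,
    with N >= out_channels. If None, fall back to random orthonormal rows
    via QR -- a stand-in for illustration only, valid when
    out_channels <= in_channels*k*k.
    """
    dim = in_channels * k * k
    if codebook is None:
        rng = np.random.default_rng(seed)
        # Orthogonalize a random Gaussian matrix; its columns are orthonormal,
        # so the transposed rows are mutually orthogonal unit vectors.
        q, _ = np.linalg.qr(rng.standard_normal((dim, dim)))
        codebook = q.T
    # Assign the first `out_channels` codewords to filters; each filter is a
    # unit-norm vector maximally separated (in the real codebook's sense)
    # from the others.
    return codebook[:out_channels].reshape(out_channels, in_channels, k, k)

W = grassmannian_init(8, 3, 3)   # 8 filters of shape (3, 3, 3)
```

In practice one would rescale the unit-norm filters (e.g., by a fan-in-dependent gain) to match the variance of a standard initializer; the sketch leaves the norms at 1 for clarity.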