Grassmannian initialization: Neural network initialization using subspace packing

May 28, 2019 (edited Jun 04, 2019) · ICML 2019 Workshop Deep Phenomena · Blind Submission
  • Keywords: Model initialization, subspace packing, training
  • TL;DR: Initialize weights using off-the-shelf Grassmannian codebooks, get faster training and better accuracy
  • Abstract: We recently observed that convolutional filters initialized as far apart from each other as possible, using off-the-shelf pre-computed Grassmannian subspace packing codebooks, performed surprisingly well across many datasets. Through this short paper, we would like to disseminate these initial results in the hope of stimulating the deep-learning community's curiosity toward classical Grassmannian subspace packing as a source of new ideas for more efficient initialization strategies.
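The idea in the abstract — choosing filters that are maximally spread apart in the Grassmannian sense — can be illustrated with a minimal sketch. The paper uses pre-computed codebooks; the snippet below instead approximates a Grassmannian line packing from scratch via simple projected gradient descent on the pairwise coherences, then reshapes the resulting unit vectors into convolutional filter weights. The function names (`grassmannian_packing`, `grassmannian_conv_init`) and the optimization scheme are illustrative assumptions, not the authors' method.

```python
import numpy as np

def grassmannian_packing(n, d, iters=500, lr=0.1, seed=0):
    """Approximate a Grassmannian line packing: n unit vectors in R^d with
    small maximum pairwise coherence |<v_i, v_j>|.

    This is a simple projected-gradient sketch, not an off-the-shelf
    pre-computed codebook as used in the paper.
    """
    rng = np.random.default_rng(seed)
    V = rng.standard_normal((n, d))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    for _ in range(iters):
        G = V @ V.T                       # Gram matrix of pairwise inner products
        np.fill_diagonal(G, 0.0)          # ignore self-correlations
        # Gradient of the sum of squared off-diagonal coherences:
        # pushes highly correlated pairs of vectors apart.
        grad = 2.0 * G @ V
        V -= lr * grad / n
        V /= np.linalg.norm(V, axis=1, keepdims=True)  # project back to unit sphere
    return V

def grassmannian_conv_init(out_ch, in_ch, k, scale=1.0):
    """Initialize conv weights of shape (out_ch, in_ch, k, k) so that the
    flattened filters form an approximate Grassmannian packing."""
    d = in_ch * k * k
    V = grassmannian_packing(out_ch, d)
    return scale * V.reshape(out_ch, in_ch, k, k)

if __name__ == "__main__":
    W = grassmannian_conv_init(16, 1, 3)  # 16 single-channel 3x3 filters
    print(W.shape)
```

The resulting tensor can be copied into a framework's conv layer in place of its default random initialization; the `scale` factor would then be chosen to match the layer's expected activation variance (e.g., a He/Glorot-style factor).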