Grassmannian initialization: Neural network initialization using sub-space packing

28 May 2019 (modified: 05 May 2023) · Submitted to ICML Deep Phenomena 2019 · Readers: Everyone
Keywords: Model initialization, subspace packing, training
TL;DR: Initialize weights using off-the-shelf Grassmannian codebooks, get faster training and better accuracy
Abstract: We recently observed that convolutional filters initialized farthest apart from each other, using off-the-shelf pre-computed Grassmannian subspace packing codebooks, performed surprisingly well across many datasets. Through this short paper, we would like to disseminate some initial results in this regard, in the hope of stimulating the curiosity of the deep-learning community towards considering classical Grassmannian subspace packing results as a source of new ideas for more efficient initialization strategies.
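To make the idea concrete, below is a minimal sketch (not the authors' code) of how a precomputed Grassmannian line-packing codebook could be copied into a convolutional layer's filters. The helper name `grassmannian_init_`, the codebook shape convention, and the random stand-in codebook in the usage section are all hypothetical; real codebooks of N well-separated unit vectors in R^d are available from published packing tables.

```python
import numpy as np
import torch
import torch.nn as nn

def grassmannian_init_(conv: nn.Conv2d, codebook: np.ndarray) -> None:
    """Copy unit-norm codebook lines into the filters of `conv`.

    codebook: (N, d) array whose rows are unit vectors in R^d, chosen so
    that pairwise absolute inner products are (near-)minimal, i.e. the
    corresponding lines are packed as far apart as possible.
    """
    out_ch, in_ch, kh, kw = conv.weight.shape
    d = in_ch * kh * kw
    assert codebook.shape[0] >= out_ch and codebook.shape[1] == d, \
        "codebook must supply at least one length-d line per filter"
    w = torch.from_numpy(codebook[:out_ch].astype(np.float32))
    with torch.no_grad():
        conv.weight.copy_(w.reshape(out_ch, in_ch, kh, kw))

# Usage: a 3x3 conv with 1 input channel needs lines in R^9 (d = 9).
# A random-then-normalized matrix stands in for a real packing codebook here.
conv = nn.Conv2d(1, 16, kernel_size=3)
fake_codebook = np.random.randn(16, 9)
fake_codebook /= np.linalg.norm(fake_codebook, axis=1, keepdims=True)
grassmannian_init_(conv, fake_codebook)
```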