DensEMANN + Sparsification: Experiments for Further Shrinking Already Small Automatically Generated DenseNets
Abstract: This paper presents experiments that we carried out combining DensEMANN (an algorithm that we are developing for automatically generating small and efficient DenseNet neural networks) with various algorithms for pruning or sparsifying neural networks at different granularity levels. The pruning algorithms that we used are based on the Lottery Ticket algorithm by Frankle and Carbin (2019) and on the Dense-Sparse-Dense (DSD) training algorithm by Han et al. (2017). Our experiments show that the DSD-based pruning method is very effective at reducing the parameter count of both human-designed and DensEMANN-generated neural networks while allowing them to recover their original accuracy, and that this is especially true when sparsification is performed at the granularity of individual convolution weights (by means of a mask that zeroes them out). Further research is nevertheless needed to determine whether (and how) this method can become an alternative to DensEMANN, or work in tandem with it, for further shrinking already small and efficient neural networks.
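The weight-level sparsification mentioned above (masking out individual convolution weights) can be sketched as a minimal magnitude-pruning routine. This is an illustrative NumPy sketch of the general masking idea used in DSD-style training, not the paper's actual implementation; the function name `magnitude_prune_mask` and its interface are assumptions.

```python
import numpy as np

def magnitude_prune_mask(weights, sparsity):
    """Return a binary mask that zeroes out the smallest-magnitude weights.

    `sparsity` is the fraction of weights to remove (e.g. 0.5 removes half).
    Illustrative only: the paper's method applies such masks to individual
    convolution weights during retraining.
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)
    if k == 0:
        return np.ones_like(weights)
    # Threshold at the k-th smallest magnitude; weights at or below it are masked.
    threshold = np.partition(flat, k - 1)[k - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))           # stand-in for a convolution kernel
mask = magnitude_prune_mask(w, 0.5)   # zero out the 8 smallest-magnitude weights
sparse_w = w * mask                   # masked weights stay zero during retraining
```

In DSD training the masked weights are later un-frozen again (the final "dense" phase), whereas Lottery-Ticket-style pruning keeps them at zero and rewinds the surviving weights to their initial values.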