Towards Unifying Neural Architecture Space Exploration and Generalization

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Withdrawn Submission
Keywords: Neural Architecture Space Exploration, Generalization, Model Compression, Network Science, Convolutional Neural Networks
Abstract: In this paper, we address a fundamental research question of significant practical interest: Can certain theoretical characteristics of CNN architectures indicate a priori (i.e., without training) which models, despite having highly different numbers of parameters and layers, achieve similar generalization performance? To answer this question, we model CNNs from a network science perspective and introduce a new, theoretically grounded, architecture-level metric called NN-Mass. We also integrate, for the first time, the PAC-Bayes theory of generalization with small-world networks to discover new synergies among our proposed NN-Mass metric, architecture characteristics, and model generalization. With experiments on real datasets such as CIFAR-10/100, we provide extensive empirical evidence for our theoretical findings. Finally, we exploit these new insights for model compression and achieve up to 3x fewer parameters and FLOPs, while losing minimal accuracy (e.g., 96.82% vs. 97%) relative to large CNNs on the CIFAR-10 dataset.
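The abstract describes NN-Mass as an architecture-level metric computed without any training, i.e., purely from the topology of the network. The exact definition is not given here, so the sketch below is only an illustration of what such a metric could look like: it scores each cell of a CNN by its skip-connection density times its width and depth, then sums over cells. The function name `nn_mass`, the cell description `(depth, width, num_skips)`, and the density-times-width-times-depth form are assumptions for illustration, not the paper's formula.

```python
def nn_mass(cells):
    """Illustrative architecture-level metric computed purely from topology.

    Each cell is described by a tuple (depth, width, num_skips).
    NOTE: the exact NN-Mass formula is not stated in this abstract; the
    density * width * depth form below is an assumption for illustration.
    """
    total = 0.0
    for depth, width, num_skips in cells:
        # Maximum possible skip (shortcut) connections inside the cell,
        # assuming any earlier layer may feed any later layer.
        max_skips = depth * (depth - 1) / 2
        density = num_skips / max_skips if max_skips > 0 else 0.0
        total += density * width * depth
    return total


# Two hypothetical CNNs with very different depths/parameter counts
# can still receive similar metric values under this toy definition.
compact_cnn = [(8, 16, 20), (8, 32, 20), (8, 64, 20)]
large_cnn = [(16, 16, 40), (16, 32, 40), (16, 64, 40)]
print(nn_mass(compact_cnn), nn_mass(large_cnn))
```

Such a training-free score is the kind of quantity the abstract suggests could be compared across architectures to predict, a priori, which ones generalize similarly.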