A SINGLE SHOT PCA-DRIVEN ANALYSIS OF NETWORK STRUCTURE TO REMOVE REDUNDANCY

27 Sept 2018 (modified: 05 May 2023) · ICLR 2019 Conference Withdrawn Submission
Abstract: Deep learning models have outperformed traditional methods in many fields such as natural language processing and computer vision. However, despite their tremendous success, methods for designing optimal Convolutional Neural Networks (CNNs) are still based on heuristics or grid search, and the resulting networks are often overparameterized, with large computational and memory requirements. This paper focuses on a structured, explainable approach to model design that maximizes accuracy while keeping computational costs tractable. We propose a single-shot analysis of a trained CNN that uses Principal Component Analysis (PCA) to determine, per layer, the number of filters performing significant transformations, without the need for retraining. This can be interpreted as identifying the dimensionality of the hypothesis space under consideration. The proposed technique also helps estimate an optimal number of layers by tracking how this dimensionality expands as the model gets deeper. The analysis can be used to design an optimal structure for a given network on a dataset, or to adapt a predesigned network to a new dataset. We demonstrate these techniques by optimizing VGG and AlexNet networks on the CIFAR-10, CIFAR-100 and ImageNet datasets.
Keywords: deep learning, model compression, pruning, PCA
TL;DR: We present a single-shot analysis of a trained neural network to remove redundancy and identify an optimal network structure.
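
The sketch below illustrates the kind of PCA-based width analysis the abstract describes: collect a trained convolutional layer's output activations, treat each filter's response as a feature, and count how many principal components are needed to explain most of the variance. It is a minimal illustration under assumed details, not the authors' released code; in particular, the 99.9% variance threshold and the toy layer are illustrative assumptions.

```python
# Minimal sketch of a PCA-based estimate of how many filters in a conv layer
# do significant work. Assumptions (not from the paper): 99.9% variance kept,
# activations captured from a single forward pass on a data batch.
import torch
import torch.nn as nn
from sklearn.decomposition import PCA


def significant_filters(activations: torch.Tensor, variance_to_keep: float = 0.999) -> int:
    """Estimate the number of 'significant' filters from layer activations.

    activations: tensor of shape (batch, channels, height, width) captured
    at the layer's output on a batch of training data.
    """
    b, c, h, w = activations.shape
    # Treat every spatial location of every image as one sample with c features.
    samples = activations.permute(0, 2, 3, 1).reshape(-1, c).cpu().numpy()
    # Keep enough principal components to explain the requested variance.
    pca = PCA(n_components=variance_to_keep, svd_solver="full")
    pca.fit(samples)
    return pca.n_components_


# Toy usage with random data standing in for a trained layer and real images.
layer = nn.Conv2d(3, 64, kernel_size=3, padding=1)
with torch.no_grad():
    acts = layer(torch.randn(32, 3, 32, 32))
print("suggested layer width:", significant_filters(acts))
```

Repeating this per layer gives a suggested width for each layer; comparing how the suggested dimensionality grows with depth is the kind of signal the abstract refers to for choosing the number of layers.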