Counting the Paths in Deep Neural Networks as a Performance Predictor

25 Sept 2019 (modified: 05 May 2023) · ICLR 2020 Conference Withdrawn Submission · Readers: Everyone
Keywords: learning theory, deep learning, convolutional neural networks
TL;DR: A quantitative measure to predict the performance of deep neural network models.
Abstract: We propose a novel quantitative measure to predict the performance of a deep neural network classifier, where the measure is derived exclusively from the graph structure of the network. We expect this measure to be a fundamental first step towards a method for evaluating new network architectures, reducing the reliance on the computationally expensive trial-and-error or "brute force" optimisation processes involved in model selection. The measure is derived in the context of multi-layer perceptrons (MLPs), but the definitions are also shown to be useful in the context of deep convolutional neural networks (CNNs), where the measure can estimate and compare the relative performance of different types of neural networks, such as VGG, ResNet, and DenseNet. Our measure is also used to study the effects of some important "hidden" hyper-parameters of the DenseNet architecture, such as the number of layers, the growth rate, and the dimension of the 1x1 convolutions in DenseNet-BC. Ultimately, our measure facilitates the optimisation of the DenseNet design, which shows improved results compared to the baseline.
Code: https://www.dropbox.com/s/am7lalpkz101b1z/source_code.zip?dl=0
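The authors' released code is linked above. As a rough, hypothetical illustration of the path-counting idea described in the abstract (not the paper's actual measure), one simple notion of "paths" in a fully connected MLP can be sketched as follows: every neuron in one layer connects to every neuron in the next, so each input-to-output path selects one neuron per layer, and the total path count is the product of the layer widths. The function name `count_paths` and the layer-width representation are assumptions for this sketch.

```python
# Hypothetical sketch: counting input-to-output paths in a fully connected MLP.
# Assumes full connectivity between consecutive layers, so each path chooses
# exactly one neuron per layer and the total is the product of layer widths.

from math import prod

def count_paths(layer_widths):
    """Number of distinct input-to-output paths in a fully connected MLP.

    layer_widths: sizes of all layers, input through output, e.g. [784, 128, 10].
    """
    if len(layer_widths) < 2:
        return 0  # need at least an input and an output layer
    return prod(layer_widths)

# Example: 3 inputs, two hidden layers of width 5, and 2 outputs.
print(count_paths([3, 5, 5, 2]))  # 3 * 5 * 5 * 2 = 150
```

For architectures with skip connections (e.g. ResNet or DenseNet), paths can bypass layers, so the count grows differently; capturing that structure is presumably where the paper's graph-based derivation comes in.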