Keywords: Deep Learning, Convolutional Neural Network, Generalizability
TL;DR: We propose a metric named Activation Representation Substitution (ARS) to explore the association between the representations learned by convolution kernels and generalization.
Abstract: Convolutional neural networks (CNNs) have achieved remarkable success in various fields due to their excellent generalizability. To explore the relationship between CNN representations and generalization, we propose an Activation Representation Substitution (ARS) metric based on the disentangled visual representations of convolution kernels. Without additional data, we obtain the disentangled visual representation of a kernel in a convolutional layer by iteratively optimizing a random image, and then feed it back into the CNN. We then examine the output activations of the other kernels in that convolutional layer. When all of these activations are small, the ARS of the convolution kernel for that representation is also small, indicating that the representation is important to the CNN. Our ablation experiments confirm that low-ARS convolution kernels are important for accuracy. With ARS, we also explain batch normalization and class selectivity. By comparing model performance on the test set, we empirically find that when a convolutional layer contains a large number of low-ARS convolution kernels, the model generalizes well. ARS is thus a metric that can be used to better understand model generalizability without external data.
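The abstract describes ARS as: optimize a random image to maximize one kernel's activation, feed that image back through the network, and aggregate the activations of the other kernels in the same layer. The sketch below illustrates that idea only; the optimization objective, number of steps, and the mean-activation aggregate are assumptions, not the authors' exact formulation, and the function names (`activation_maximization`, `ars`) are hypothetical.

```python
# Hypothetical PyTorch sketch of the ARS idea as summarized in the abstract.
# The exact objective, normalization, and aggregation used by the authors
# are not specified here and are assumed.
import torch

def activation_maximization(model, layer, kernel_idx, steps=200, lr=0.1, size=224):
    """Iteratively optimize a random image to maximize the mean activation
    of one convolution kernel (standard activation maximization)."""
    model.eval()
    image = torch.randn(1, 3, size, size, requires_grad=True)
    optimizer = torch.optim.Adam([image], lr=lr)

    captured = {}
    handle = layer.register_forward_hook(
        lambda m, inp, out: captured.__setitem__("out", out)
    )
    for _ in range(steps):
        optimizer.zero_grad()
        model(image)
        loss = -captured["out"][0, kernel_idx].mean()  # maximize the kernel's response
        loss.backward()
        optimizer.step()
    handle.remove()
    return image.detach()

def ars(model, layer, kernel_idx, **am_kwargs):
    """ARS of one kernel: feed its disentangled (activation-maximized) image
    back into the CNN and aggregate the activations of the *other* kernels in
    the same layer. A simple mean is used here as an assumed aggregate;
    lower values indicate a more important representation."""
    image = activation_maximization(model, layer, kernel_idx, **am_kwargs)

    captured = {}
    handle = layer.register_forward_hook(
        lambda m, inp, out: captured.__setitem__("out", out)
    )
    with torch.no_grad():
        model(image)
    handle.remove()

    out = captured["out"][0]                          # shape (C, H, W)
    others = [c for c in range(out.shape[0]) if c != kernel_idx]
    return out[others].mean().item()
```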
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics
Submission Guidelines: Yes
Please Choose The Closest Area That Your Submission Falls Into: Deep Learning and representational learning
Supplementary Material: zip