PRUNING CNNS WITH GRAPH RANDOM WALK & RANDOM MATRIX THEORY

27 Sept 2024 (modified: 22 Nov 2024) · ICLR 2025 Conference Withdrawn Submission · CC BY 4.0
Keywords: Model Compression, Graph Learning, Random Matrix Theory
Abstract: To facilitate the deployment of convolutional neural networks on resource-limited devices, filter pruning has emerged as an effective strategy because it enables practical acceleration. Evaluating the importance of filters is a central challenge in this field. Most existing filter-pruning works assess the relationships of filters using pairwise measures such as Euclidean distance and cosine correlation, which may fail to capture the global structure within a layer. In this paper, we propose a novel filter pruning method that leverages a graph-based approach to model the relationships among filters in convolutional layers. Each filter is represented as a node in a directed graph, and the edges between nodes capture the linear dependencies between filters. This structure allows us to assess the relative importance of each filter by conducting a random walk on the graph. Filters that exhibit weaker connections to others are considered less important and are pruned with minimal impact on model performance. Furthermore, we examine the eigenvalue spectrum of the adjacency matrix and observe a distribution similar to that of spiked models in random matrix theory. This suggests that the spiked eigenvalues can serve as a significant indicator of the importance of each convolutional layer. We conduct image classification experiments on CIFAR-10 and ImageNet to demonstrate the superiority of our method over state-of-the-art approaches.
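The random-walk scoring described in the abstract might be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the absolute-cosine adjacency stands in for their linear-dependency measure, and the PageRank-style damping is an assumed choice for making the walk well-defined.

```python
import numpy as np

def filter_importance(weights, damping=0.85, iters=100):
    """Hypothetical sketch: rank convolutional filters by random-walk centrality.

    weights: (n_filters, fan_in) array, each row a flattened filter.
    Returns a stationary distribution over filters; filters with low mass
    (weak connections to the rest of the layer) are pruning candidates.
    """
    n = weights.shape[0]
    # Edge weights: absolute cosine similarity, an assumed stand-in for the
    # paper's linear-dependency measure between filters.
    norms = np.linalg.norm(weights, axis=1, keepdims=True)
    unit = weights / np.clip(norms, 1e-12, None)
    A = np.abs(unit @ unit.T)
    np.fill_diagonal(A, 0.0)
    # Row-normalize into a transition matrix and run a damped random walk.
    P = A / np.clip(A.sum(axis=1, keepdims=True), 1e-12, None)
    pi = np.full(n, 1.0 / n)
    for _ in range(iters):
        pi = damping * (pi @ P) + (1.0 - damping) / n
    return pi
```

The eigenvalue spectrum the abstract refers to would then be `np.linalg.eigvalsh` applied to the same adjacency matrix `A`, with eigenvalues escaping the bulk read as the "spiked" layer-importance signal.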
Primary Area: other topics in machine learning (i.e., none of the above)
Code Of Ethics: I acknowledge that I and all co-authors of this work have read and commit to adhering to the ICLR Code of Ethics.
Submission Guidelines: I certify that this submission complies with the submission instructions as described on https://iclr.cc/Conferences/2025/AuthorGuide.
Reciprocal Reviewing: I understand the reciprocal reviewing requirement as described on https://iclr.cc/Conferences/2025/CallForPapers. If none of the authors are registered as a reviewer, it may result in a desk rejection at the discretion of the program chairs. To request an exception, please complete this form at https://forms.gle/Huojr6VjkFxiQsUp6.
Anonymous Url: I certify that there is no URL (e.g., github page) that could be used to find authors’ identity.
No Acknowledgement Section: I certify that there is no acknowledgement section in this submission for double blind review.
Submission Number: 10240